Apr 20 20:11:07.206859 ip-10-0-134-66 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:11:07.636927 ip-10-0-134-66 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:07.636927 ip-10-0-134-66 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:11:07.636927 ip-10-0-134-66 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:07.636927 ip-10-0-134-66 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:11:07.636927 ip-10-0-134-66 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:11:07.638483 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.638420 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:11:07.641192 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641178 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:07.641192 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641193 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641196 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641199 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641202 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641205 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641208 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641210 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641213 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641216 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641218 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641221 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641223 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641226 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641229 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641231 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641233 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641240 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641243 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641246 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641248 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:11:07.641250 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641251 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641253 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641256 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641259 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641261 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641264 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641266 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641269 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641271 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641273 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641276 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641278 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641281 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641283 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641286 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641288 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641290 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641293 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641295 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:07.641806 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641298 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641300 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641303 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641307 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641311 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641313 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641317 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
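The deprecation warnings at the start of this excerpt say each flag should move into the file named by --config, which the FLAG dump later in this log shows as /etc/kubernetes/kubelet.conf. A minimal KubeletConfiguration sketch of that migration, using kubelet.config.k8s.io/v1beta1 field names; the values are copied from the FLAG dump below, while the eviction threshold is an illustrative assumption rather than a value taken from this node:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint=/var/run/crio/crio.sock
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
    systemReserved:
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
    # replaces --minimum-container-ttl-duration; this threshold is an
    # illustrative assumption, not a value from this node
    evictionHard:
      memory.available: 100Mi

Per the server.go:212 hint, --pod-infra-container-image has no KubeletConfiguration equivalent: the sandbox (pause) image is configured in the CRI runtime instead, e.g. pause_image in /etc/crio/crio.conf for CRI-O.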
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641321 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641323 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641326 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641328 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641331 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641333 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641335 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641338 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641341 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641344 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641346 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641349 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:11:07.642275 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641351 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641354 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641356 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641358 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641361 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641363 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641366 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641368 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641370 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641373 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641376 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641378 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641381 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641383 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641385 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641388 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641390 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641393 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641395 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641397 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:07.642748 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641400 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641402 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641404 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641407 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641409 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641412 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641414 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641765 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641772 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641775 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641777 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641780 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641783 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641785 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641788 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641790 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641793 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641795 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641798 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641800 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:07.643227 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641803 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641805 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641807 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641810 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641813 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641815 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641818 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641820 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641822 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641826 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641828 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641831 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641833 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641836 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641838 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641841 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641844 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641846 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641848 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641851 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:07.643704 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641853 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641856 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641858 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641860 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641863 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641865 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641867 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641870 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641872 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641875 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641877 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641879 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641881 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641885 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641889 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641891 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641894 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641896 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641899 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641901 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:07.644168 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641903 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641914 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641917 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641920 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641923 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641925 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641929 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641932 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641935 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641939 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641960 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641964 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641967 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641970 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641972 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641974 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641978 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641980 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641983 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:07.644642 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641985 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641988 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641990 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641993 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641995 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.641997 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.642000 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.642002 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.642004 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.642007 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.642009 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.642012 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.642014 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.642017 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643307 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643317 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643323 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643327 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643331 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643334 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643338 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:11:07.645082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643342 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643345 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643348 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643352 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643355 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643358 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643361 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643363 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643366 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643368 2578 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643371 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643374 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643378 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643381 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643383 2578 flags.go:64] FLAG: --config-dir=""
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643386 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643389 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643393 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643396 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643399 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643403 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643405 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643408 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643411 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643414 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:11:07.645590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643418 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643422 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643425 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643427 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643430 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643433 2578 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643435 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643440 2578 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643443 2578 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643445 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643449 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643451 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643455 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643458 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643460 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643463 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643466 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643468 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643471 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643474 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643476 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643479 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643482 2578 flags.go:64] FLAG: --feature-gates=""
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643485 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643488 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 20:11:07.646223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643491 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643494 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643497 2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643500 2578 flags.go:64] FLAG: --help="false"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643503 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-134-66.ec2.internal"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643506 2578 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643509 2578 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643512 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643516 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643519 2578 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643521 2578 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643524 2578 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643527 2578 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643529 2578 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643532 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643535 2578 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643538 2578 flags.go:64] FLAG: --kube-reserved=""
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643540 2578 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643543 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643546 2578 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643549 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643552 2578 flags.go:64] FLAG: --lock-file=""
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643554 2578 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643557 2578 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 20:11:07.646825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643560 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643565 2578 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643568 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643571 2578 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643585 2578 flags.go:64] FLAG: --logging-format="text"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643588 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643591 2578 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643594 2578 flags.go:64] FLAG: --manifest-url=""
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643596 2578 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643601 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643604 2578 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643608 2578 flags.go:64] FLAG: --max-pods="110"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643610 2578 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643613 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643616 2578 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643619 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643622 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643625 2578 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643628 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643634 2578 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643637 2578 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643640 2578 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643643 2578 flags.go:64] FLAG: --pod-cidr=""
Apr 20 20:11:07.647415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643646 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643651 2578 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643653 2578 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643656 2578 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643659 2578 flags.go:64] FLAG: --port="10250"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643662 2578 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643665 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-098bd36e081f859d1"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643668 2578 flags.go:64] FLAG: --qos-reserved=""
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643671 2578 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643674 2578 flags.go:64] FLAG: --register-node="true"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643677 2578 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643679 2578 flags.go:64] FLAG: --register-with-taints=""
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643683 2578 flags.go:64] FLAG: --registry-burst="10"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643686 2578 flags.go:64] FLAG: --registry-qps="5"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643688 2578 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643691 2578 flags.go:64] FLAG: --reserved-memory=""
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643694 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643697 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643700 2578 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643702 2578 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643705 2578 flags.go:64] FLAG: --runonce="false"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643707 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643711 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643713 2578 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643716 2578 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643720 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 20:11:07.647966 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643722 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643725 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643728 2578 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643731 2578 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643733 2578 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643736 2578 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643738 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643741 2578 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643744 2578 flags.go:64] FLAG: --system-cgroups=""
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643746 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643751 2578 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643754 2578 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643757 2578 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643760 2578 flags.go:64] FLAG: --tls-min-version=""
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643763 2578 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643767 2578 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643770 2578 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643773 2578 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643775 2578 flags.go:64] FLAG: --v="2"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643779 2578 flags.go:64] FLAG: --version="false"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643783 2578 flags.go:64] FLAG: --vmodule=""
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643786 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.643789 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643872 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:11:07.648623 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643875 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643878 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643881 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643884 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643887 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643890 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643892 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643895 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643897 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643900 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643903 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643905 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643908 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643910 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643913 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643915 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643917 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643919 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643922 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643924 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:11:07.649491 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643927 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643929 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643932 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643936 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643938 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643941 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643943 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643946 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643948 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643950 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643953 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643955 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643957 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643960 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643962 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643964 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643967 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643971 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643974 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643976 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:11:07.650040 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643979 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643981 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643983 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643986 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643988 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643991 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643993 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643996 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.643998 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644000 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644002 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644004 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644007 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644009 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644011 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644015 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644018 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644020 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644022 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644025 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:11:07.650525 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644027 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644029 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644032 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644034 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644036 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644038 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644041 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644043 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644046 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644048 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644051 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644053 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644055 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644058 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644060 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644064 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644067 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644070 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644072 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:11:07.651035 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644075 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:11:07.651482 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644077 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:11:07.651482 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644080 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:11:07.651482 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644082 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:11:07.651482 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644085 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:11:07.651482 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.644087 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:11:07.651482 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.644092 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:11:07.651801 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.651783 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 20:11:07.651828 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.651802 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 20:11:07.651854 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.651845 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:11:07.651854 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.651851 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:11:07.651854 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.651854 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:11:07.651933 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.651857 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:11:07.651933 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.651860 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:11:07.651933 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.651863 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:11:07.651933 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.651865 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:11:07.651933 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:07.651868 2578 feature_gate.go:328] unrecognized feature gate: 
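Judging by the gate names (GatewayAPI, AutomatedEtcdBackup, MachineAPIMigration, and so on), the long runs of "unrecognized feature gate" warnings appear to be OpenShift cluster-level feature gates written wholesale into the kubelet's configuration; the kubelet's gate parser only knows the upstream Kubernetes gates, so it warns about each name it does not recognize, skips it, and applies the rest, ending with the feature gates: {map[...]} summary above. A minimal sketch of that parse-warn-skip shape, assuming a hypothetical known-gate table rather than the real k8s.io/component-base/featuregate registry:

```go
package main

import "fmt"

// knownGates stands in for the kubelet's built-in gate registry.
// Hypothetical subset for illustration, not the real table.
var knownGates = map[string]bool{
	"DynamicResourceAllocation": false,
	"ImageVolume":               true,
	"KMSv1":                     false,
}

// setGates mimics the behavior visible in the log: unknown names are
// warned about and skipped; known names override the defaults, and the
// effective map is reported at the end.
func setGates(requested map[string]bool, warnf func(format string, args ...any)) map[string]bool {
	effective := make(map[string]bool, len(knownGates))
	for name, def := range knownGates {
		effective[name] = def
	}
	for name, val := range requested {
		if _, ok := knownGates[name]; !ok {
			warnf("unrecognized feature gate: %s", name)
			continue
		}
		effective[name] = val
	}
	return effective
}

func main() {
	requested := map[string]bool{"GatewayAPI": true, "KMSv1": true} // one unknown, one known
	gates := setGates(requested, func(f string, a ...any) {
		fmt.Printf("W feature_gate.go:328] "+f+"\n", a...)
	})
	fmt.Println("feature gates:", gates)
}
```

The same parse evidently runs once per consumer of the gate set during startup, which is why an essentially identical warning block shows up several times within the same second of the journal.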
Apr 20 20:11:07.656098 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.653016 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 20:11:07.656763 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.656751 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 20:11:07.657555 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.657544 2578 server.go:1019] "Starting client certificate rotation"
Apr 20 20:11:07.657655 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.657638 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:11:07.657694 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.657681 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:11:07.682923 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.682905 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:11:07.685750 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.685729 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:11:07.698479 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.698464 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:11:07.706056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.706040 2578 log.go:25] "Validated CRI v1 image API"
Apr 20 20:11:07.707762 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.707730 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:11:07.710161 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.710145 2578 fs.go:135] Filesystem UUIDs: map[199bab79-51ae-4e9b-8808-d45bb4446f5d:/dev/nvme0n1p3 3e39bbb5-a099-4ca2-b808-f84c2a650cc5:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
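The "Validated CRI v1 runtime API" / "Validated CRI v1 image API" records are the kubelet's gRPC handshake with CRI-O, and the cgroupDriver="systemd" line shows the driver being taken from the runtime rather than from kubelet flags. A standalone version of that first handshake might look like the following sketch, assuming CRI-O's conventional socket path /var/run/crio/crio.sock (not shown in this log):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Socket path is an assumption: the conventional CRI-O endpoint,
	// normally passed to the kubelet via --container-runtime-endpoint.
	conn, err := grpc.DialContext(ctx, "unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()), grpc.WithBlock())
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	// Version is the same CRI v1 call the kubelet uses when validating the runtime API.
	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ver, err := rt.Version(ctx, &runtimeapi.VersionRequest{})
	if err != nil {
		panic(err)
	}
	fmt.Println(ver.GetRuntimeName(), ver.GetRuntimeVersion(), ver.GetRuntimeApiVersion())
}
```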
Apr 20 20:11:07.710208 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.710161 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:11:07.713572 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.713557 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:11:07.715728 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.715634 2578 manager.go:217] Machine: {Timestamp:2026-04-20 20:11:07.713757574 +0000 UTC m=+0.392024621 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3178354 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d3f3626e3b47edcafb396217a7bcf SystemUUID:ec2d3f36-26e3-b47e-dcaf-b396217a7bcf BootID:42aace2f-eddf-4179-8c5f-b92e68c8a368 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5c:e6:09:af:35 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5c:e6:09:af:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:34:8b:bc:04:b6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:11:07.715728 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.715724 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:11:07.715825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.715787 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:11:07.718344 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.718325 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:11:07.718462 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.718346 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-66.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 20:11:07.718504 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.718471 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 20:11:07.718504 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.718479 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 20:11:07.718504 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.718491 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 20:11:07.719228 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.719219 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 20:11:07.720374 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.720365 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:11:07.720465 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.720457 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 20:11:07.723374 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.723364 2578 kubelet.go:491] "Attempting to sync node with API server"
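The nodeConfig dump above carries the reservation and eviction numbers that determine node allocatable: "SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"}, "KubeReserved":null, and a hard eviction threshold of 100Mi for memory.available. Combined with the MemoryCapacity of 32812171264 bytes from the Machine record, allocatable memory follows the usual formula capacity minus kube-reserved minus system-reserved minus the hard eviction threshold; a back-of-the-envelope check:

```go
package main

import "fmt"

func main() {
	const (
		gi = int64(1) << 30 // 1Gi in bytes
		mi = int64(1) << 20 // 1Mi in bytes
	)
	capacity := int64(32812171264) // MemoryCapacity from the Machine record above
	systemReserved := 1 * gi       // "SystemReserved":{"memory":"1Gi"}
	kubeReserved := int64(0)       // "KubeReserved":null
	evictionHard := 100 * mi       // HardEvictionThresholds: memory.available < 100Mi

	// allocatable = capacity - kubeReserved - systemReserved - hard eviction threshold
	allocatable := capacity - kubeReserved - systemReserved - evictionHard
	fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
		allocatable, float64(allocatable)/float64(gi))
}
```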
server" Apr 20 20:11:07.723416 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.723382 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 20:11:07.723416 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.723398 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:11:07.723416 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.723406 2578 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:11:07.723416 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.723414 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 20:11:07.724422 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.724411 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:11:07.724466 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.724427 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:11:07.727416 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.727399 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:11:07.728651 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.728636 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:11:07.730499 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730475 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:11:07.730499 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730499 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730512 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730521 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730531 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730540 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730548 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730557 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730567 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730594 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730619 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:11:07.730645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.730632 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:11:07.732624 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.732613 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 
20:11:07.732686 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.732627 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 20:11:07.733271 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.733250 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:11:07.733271 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.733254 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-66.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:11:07.735862 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.735847 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:11:07.735943 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.735888 2578 server.go:1295] "Started kubelet" Apr 20 20:11:07.736007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.735964 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:11:07.736094 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.736043 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:11:07.736134 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.736119 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:11:07.736600 ip-10-0-134-66 systemd[1]: Started Kubernetes Kubelet. Apr 20 20:11:07.737028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.737014 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:11:07.738754 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.738739 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:11:07.741803 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.741785 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:11:07.741803 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.741800 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:11:07.742938 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.742335 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found" Apr 20 20:11:07.742938 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742524 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:11:07.742938 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742529 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:11:07.742938 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742547 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:11:07.742938 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742693 2578 factory.go:55] Registering systemd factory Apr 20 20:11:07.742938 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742712 2578 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:11:07.742938 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742867 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:11:07.742938 
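Among the records above, the kubelet starts serving its podresources gRPC API on unix:/var/lib/kubelet/pod-resources/kubelet.sock, with the list method rate-limited to qps=100 and burstTokens=10. A client running on the node could query it roughly as in this sketch, using the published k8s.io/kubelet API (error handling reduced to panics for brevity):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	podresourcesv1 "k8s.io/kubelet/pkg/apis/podresources/v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// The endpoint logged above: unix:/var/lib/kubelet/pod-resources/kubelet.sock
	conn, err := grpc.DialContext(ctx, "unix:///var/lib/kubelet/pod-resources/kubelet.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()), grpc.WithBlock())
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	// List returns the pods the kubelet knows about and their resource assignments.
	client := podresourcesv1.NewPodResourcesListerClient(conn)
	resp, err := client.List(ctx, &podresourcesv1.ListPodResourcesRequest{})
	if err != nil {
		panic(err)
	}
	for _, pod := range resp.GetPodResources() {
		fmt.Println(pod.GetNamespace() + "/" + pod.GetName())
	}
}
```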
Apr 20 20:11:07.742938 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742875 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 20:11:07.743232 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742948 2578 factory.go:153] Registering CRI-O factory
Apr 20 20:11:07.743232 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.742959 2578 factory.go:223] Registration of the crio container factory successfully
Apr 20 20:11:07.743982 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.743951 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 20:11:07.744076 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.744003 2578 factory.go:103] Registering Raw factory
Apr 20 20:11:07.744076 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.744022 2578 manager.go:1196] Started watching for new ooms in manager
Apr 20 20:11:07.744888 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.744841 2578 manager.go:319] Starting recovery of all containers
Apr 20 20:11:07.746685 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.746635 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 20:11:07.750883 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.750852 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-66.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 20:11:07.751292 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.751269 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 20:11:07.751381 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.751344 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-66.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 20:11:07.752024 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.750943 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-66.ec2.internal.18a829ab1332235b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-66.ec2.internal,UID:ip-10-0-134-66.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-66.ec2.internal,},FirstTimestamp:2026-04-20 20:11:07.735860059 +0000 UTC m=+0.414127107,LastTimestamp:2026-04-20 20:11:07.735860059 +0000 UTC m=+0.414127107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-66.ec2.internal,}"
Apr 20 20:11:07.752443 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.752421 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9lmms"
Apr 20 20:11:07.756331 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.756202 2578 manager.go:324] Recovery completed
Apr 20 20:11:07.757876 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.757775 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 20 20:11:07.758976 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.758958 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9lmms"
Apr 20 20:11:07.760570 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.760555 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:11:07.762767 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.762754 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:11:07.762827 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.762779 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:11:07.762827 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.762791 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:11:07.763247 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.763235 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 20:11:07.763247 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.763245 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 20:11:07.763339 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.763279 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:11:07.764348 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.764295 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-66.ec2.internal.18a829ab14ccb103 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-66.ec2.internal,UID:ip-10-0-134-66.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-66.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-66.ec2.internal,},FirstTimestamp:2026-04-20 20:11:07.762766083 +0000 UTC m=+0.441033130,LastTimestamp:2026-04-20 20:11:07.762766083 +0000 UTC m=+0.441033130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-66.ec2.internal,}"
Apr 20 20:11:07.765484 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.765473 2578 policy_none.go:49] "None policy: Start"
Apr 20 20:11:07.765526 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.765489 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 20:11:07.765526 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.765498 2578 state_mem.go:35] "Initializing new in-memory state store"
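The cluster of "system:anonymous" forbidden errors above is the visible side of TLS bootstrap: until the signing request csr-9lmms is approved and issued (the csr.go:274 and csr.go:270 lines), the kubelet has no client certificate, so listing nodes, taking the node lease, and recording events are all rejected; the errors stop once the issued certificate is picked up. Inspecting such CSRs programmatically could look like the following sketch, assuming admin credentials in a kubeconfig at an illustrative path:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Illustrative kubeconfig path; any credential allowed to read CSRs works.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/admin.kubeconfig")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// List CSRs the way an approver would look for a node's bootstrap
	// request (csr-9lmms in the log above).
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().
		List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, csr := range csrs.Items {
		fmt.Println(csr.Name, csr.Spec.SignerName, csr.Spec.Username)
	}
}
```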
in-memory state store" Apr 20 20:11:07.792074 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.792061 2578 manager.go:341] "Starting Device Plugin manager" Apr 20 20:11:07.792144 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.792085 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 20:11:07.792144 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.792093 2578 server.go:85] "Starting device plugin registration server" Apr 20 20:11:07.792289 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.792276 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 20:11:07.792354 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.792294 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 20:11:07.792406 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.792363 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 20:11:07.792458 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.792451 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 20:11:07.792509 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.792460 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 20:11:07.793390 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.792898 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 20:11:07.793390 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.792930 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-66.ec2.internal\" not found" Apr 20 20:11:07.884617 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.884573 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 20:11:07.885775 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.885763 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 20:11:07.885832 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.885785 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 20:11:07.885832 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.885800 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 20:11:07.885832 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.885806 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 20:11:07.885932 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.885832 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 20:11:07.889656 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.889610 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:07.892504 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.892493 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:11:07.893190 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.893178 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:11:07.893257 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.893201 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:11:07.893257 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.893211 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:11:07.893257 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.893231 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-66.ec2.internal"
Apr 20 20:11:07.899506 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.899492 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-66.ec2.internal"
Apr 20 20:11:07.899564 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.899511 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-66.ec2.internal\": node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:07.918402 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:07.918382 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:07.986827 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.986807 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"]
Apr 20 20:11:07.986874 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.986866 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:11:07.988692 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.988678 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:11:07.988751 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.988702 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:11:07.988751 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.988716 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:11:07.989837 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.989826 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:11:07.990018 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.990005 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:07.990055 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.990030 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:11:07.990546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.990520 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:11:07.990627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.990557 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:11:07.990627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.990566 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:11:07.990627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.990573 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:11:07.990627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.990605 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:11:07.990627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.990614 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:11:07.991838 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.991820 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:07.991928 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.991854 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:11:07.993340 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.993325 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:11:07.993421 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.993354 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:11:07.993421 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:07.993367 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:11:08.015150 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.015126 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-66.ec2.internal\" not found" node="ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.019022 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.019008 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.019496 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.019483 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-66.ec2.internal\" not found" node="ip-10-0-134-66.ec2.internal"
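kube-apiserver-proxy and kube-rbac-proxy-crio arrive with source="file": they are static pods read from manifests on disk, not API objects. The kubelet later advertises them by creating mirror pods in the API server; the kubelet.go:3336 errors show it declining to do so while its own Node object is still missing from the local informer cache. A small sketch (illustrative, not kubelet code) of how a mirror pod can be recognized once it exists:

// mirrorpod.go - a mirror pod carries the kubernetes.io/config.mirror annotation.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// isMirrorPod reports whether a pod fetched from the API server is the mirror
// of a file-sourced static pod.
func isMirrorPod(pod *corev1.Pod) bool {
	_, ok := pod.Annotations["kubernetes.io/config.mirror"]
	return ok
}

func main() {
	pod := &corev1.Pod{}
	pod.Annotations = map[string]string{
		// On a real mirror pod the value is the config hash of the on-disk
		// manifest; this value is only an example.
		"kubernetes.io/config.mirror": "2993d233f9a00d06d5f483ee8282a6f2",
	}
	fmt.Println(isMirrorPod(pod)) // true
}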
Apr 20 20:11:08.044039 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.044023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.044090 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.044045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19015d708a7f5256313024ebc4553800-config\") pod \"kube-apiserver-proxy-ip-10-0-134-66.ec2.internal\" (UID: \"19015d708a7f5256313024ebc4553800\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.044090 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.044062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.119769 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.119750 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.145101 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.145066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19015d708a7f5256313024ebc4553800-config\") pod \"kube-apiserver-proxy-ip-10-0-134-66.ec2.internal\" (UID: \"19015d708a7f5256313024ebc4553800\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.145101 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.145093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19015d708a7f5256313024ebc4553800-config\") pod \"kube-apiserver-proxy-ip-10-0-134-66.ec2.internal\" (UID: \"19015d708a7f5256313024ebc4553800\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.145176 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.145098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.145176 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.145125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.145176 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.145134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.145176 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.145153 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2993d233f9a00d06d5f483ee8282a6f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal\" (UID: \"2993d233f9a00d06d5f483ee8282a6f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.220330 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.220313 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.316956 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.316934 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.321287 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.321274 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.322377 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.322363 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:08.422242 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.422194 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.522747 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.522725 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.623305 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.623286 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.657815 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.657796 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 20:11:08.658232 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.657900 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:11:08.724062 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.724019 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.742348 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.742333 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 20:11:08.760634 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.760611 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:06:07 +0000 UTC" deadline="2027-10-23 18:12:26.596453283 +0000 UTC"
Apr 20 20:11:08.760709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.760634 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13222h1m17.835822714s"
Apr 20 20:11:08.765414 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.765398 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:11:08.783465 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.783449 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-69dwr"
Apr 20 20:11:08.791941 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.791925 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-69dwr"
Apr 20 20:11:08.792515 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.792503 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:08.824542 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.824524 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.925006 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:08.924981 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-66.ec2.internal\" not found"
Apr 20 20:11:08.931361 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:08.931336 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19015d708a7f5256313024ebc4553800.slice/crio-9920bfd09c8720532f1b2db77b9d68233e96ddc6e7fab763a191a064fb1d28c1 WatchSource:0}: Error finding container 9920bfd09c8720532f1b2db77b9d68233e96ddc6e7fab763a191a064fb1d28c1: Status 404 returned error can't find the container with id 9920bfd09c8720532f1b2db77b9d68233e96ddc6e7fab763a191a064fb1d28c1
Apr 20 20:11:08.931830 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:08.931808 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2993d233f9a00d06d5f483ee8282a6f2.slice/crio-33b70bbab7af01fc36607b6890eac92b357ae8534e8e6bf3ec08d7790be49331 WatchSource:0}: Error finding container 33b70bbab7af01fc36607b6890eac92b357ae8534e8e6bf3ec08d7790be49331: Status 404 returned error can't find the container with id 33b70bbab7af01fc36607b6890eac92b357ae8534e8e6bf3ec08d7790be49331
Apr 20 20:11:08.936066 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.936049 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:11:08.988517 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:08.988473 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:09.030398 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.030378 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:09.042464 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.042446 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:09.052619 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.052602 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
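The certificate_manager.go:715/431 pair shows how rotation is scheduled: the kubelet picks a deadline at a random point roughly 70-90% of the way through the certificate's lifetime and sleeps until then (here about 13,222 hours). A sketch of that arithmetic, assuming an issue time of 2026-04-20 to match the logged expiration and deadline; the exact jitter policy lives in k8s.io/client-go/util/certificate, and this is an approximation of it, not that code:

// rotation.go - approximates the kubelet's rotation-deadline calculation.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

func main() {
	notBefore := time.Date(2026, 4, 20, 20, 6, 7, 0, time.UTC) // assumed issue time
	notAfter := time.Date(2028, 4, 19, 20, 6, 7, 0, time.UTC)  // expiration from the log

	lifetime := notAfter.Sub(notBefore)
	fraction := 0.7 + 0.2*rand.Float64() // a random point in the 70%..90% band
	deadline := notBefore.Add(time.Duration(float64(lifetime) * fraction))

	// The logged sleep="13222h1m17.835822714s" is simply deadline minus "now".
	fmt.Printf("deadline=%s sleep=%s\n", deadline, time.Until(deadline))
}

The logged deadline of 2027-10-23 sits about 75% of the way through the two-year lifetime, consistent with that band.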
Apr 20 20:11:09.053653 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.053642 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal"
Apr 20 20:11:09.062429 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.062411 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 20:11:09.467449 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.467408 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:11:09.725035 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.724973 2578 apiserver.go:52] "Watching apiserver"
Apr 20 20:11:09.735426 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.735228 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 20:11:09.736136 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.736116 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qdxkm","openshift-network-diagnostics/network-check-target-g75rf","openshift-network-operator/iptables-alerter-dnglx","kube-system/global-pull-secret-syncer-2cxr4","kube-system/konnectivity-agent-l9zp2","kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal","openshift-dns/node-resolver-nd6kq","openshift-image-registry/node-ca-xq679","openshift-multus/network-metrics-daemon-zc8kt","openshift-ovn-kubernetes/ovnkube-node-zg9wv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf","openshift-cluster-node-tuning-operator/tuned-qg8wq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal","openshift-multus/multus-9fj49"]
Apr 20 20:11:09.739280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.739217 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l9zp2"
Apr 20 20:11:09.740258 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.740235 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.740350 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.740309 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.741365 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.741346 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.741457 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.741416 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd"
Apr 20 20:11:09.742034 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.741759 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-sbtll\""
Apr 20 20:11:09.742034 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.741771 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 20:11:09.742034 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.741779 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 20:11:09.742513 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.742497 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 20:11:09.742664 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.742499 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 20:11:09.742769 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.742725 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 20:11:09.742833 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.742823 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 20:11:09.743340 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.743132 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 20:11:09.743340 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.743144 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xg57g\""
Apr 20 20:11:09.743340 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.743181 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k2xsp\""
Apr 20 20:11:09.744699 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.744672 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dnglx"
Apr 20 20:11:09.746810 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.746790 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xq679"
Apr 20 20:11:09.748121 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.748097 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 20:11:09.748325 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.748308 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sn7f8\""
Apr 20 20:11:09.748756 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.748664 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 20:11:09.749171 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.748969 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:11:09.749250 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.749129 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 20:11:09.749306 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.749269 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qdxkm"
Apr 20 20:11:09.750164 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.750141 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt"
Apr 20 20:11:09.750419 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.750397 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d"
Apr 20 20:11:09.751254 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.751233 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x9bmk\""
Apr 20 20:11:09.751442 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.751426 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 20:11:09.751442 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.751436 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 20:11:09.751725 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.751707 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 20:11:09.751725 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.751720 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 20:11:09.752448 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.752430 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
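The pod_workers.go:1301 "network is not ready" errors are expected at this stage: no CNI config exists yet in /etc/kubernetes/cni/net.d/ because the pods that will write it (multus, ovnkube-node) are themselves only now being started. Those network pods run with hostNetwork: true, so they do not need CNI; only pod-network pods such as network-metrics-daemon-zc8kt are skipped and retried. An illustrative client-go sketch (assumed kubeconfig path) that splits a node's pods along that line:

// hostnet.go - lists the pods scheduled to one node and separates host-network
// pods, which can start before CNI is ready, from pod-network pods.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // illustrative path
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	pods, err := cs.CoreV1().Pods("").List(context.TODO(), metav1.ListOptions{
		FieldSelector: "spec.nodeName=ip-10-0-134-66.ec2.internal",
	})
	if err != nil {
		panic(err)
	}
	for _, p := range pods.Items {
		net := "pod-network (needs CNI)"
		if p.Spec.HostNetwork {
			net = "host-network (starts without CNI)"
		}
		fmt.Printf("%s/%s: %s\n", p.Namespace, p.Name, net)
	}
}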
Apr 20 20:11:09.752650 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.752630 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-c9c95\""
Apr 20 20:11:09.752972 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.752958 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 20:11:09.753121 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.753110 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 20:11:09.753121 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.753116 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 20:11:09.753835 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.753819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-iptables-alerter-script\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx"
Apr 20 20:11:09.753923 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.753846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/47af8f64-fc60-48b5-84b3-4c3888a04afa-konnectivity-ca\") pod \"konnectivity-agent-l9zp2\" (UID: \"47af8f64-fc60-48b5-84b3-4c3888a04afa\") " pod="kube-system/konnectivity-agent-l9zp2"
Apr 20 20:11:09.753923 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.753871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-kubelet-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.754025 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.753941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-device-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.754025 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dljjn\" (UniqueName: \"kubernetes.io/projected/35c64ee5-3248-42b5-92a6-7ceb4aaad134-kube-api-access-dljjn\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.754126 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3246feba-bd0f-4518-845a-c9025f63125b-tmp-dir\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.754126 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrtxw\" (UniqueName: \"kubernetes.io/projected/3246feba-bd0f-4518-845a-c9025f63125b-kube-api-access-rrtxw\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.754126 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/06fca970-a9c2-455e-87f0-219af27b7cfd-kubelet-config\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.754126 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-host-slash\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx"
Apr 20 20:11:09.754280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-socket-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.754280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754165 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-sys-fs\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.754280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/06fca970-a9c2-455e-87f0-219af27b7cfd-dbus\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.754280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/47af8f64-fc60-48b5-84b3-4c3888a04afa-agent-certs\") pod \"konnectivity-agent-l9zp2\" (UID: \"47af8f64-fc60-48b5-84b3-4c3888a04afa\") " pod="kube-system/konnectivity-agent-l9zp2"
Apr 20 20:11:09.754280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.754280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfmv7\" (UniqueName: \"kubernetes.io/projected/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-kube-api-access-lfmv7\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx"
Apr 20 20:11:09.754465 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754289 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-registration-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.754465 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-etc-selinux\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.754465 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754351 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3246feba-bd0f-4518-845a-c9025f63125b-hosts-file\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.754957 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754938 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 20:11:09.755059 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.754974 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 20:11:09.755059 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.755004 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 20:11:09.755182 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.755162 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 20:11:09.756400 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.756330 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf"
Apr 20 20:11:09.756624 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.756398 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f"
Apr 20 20:11:09.756624 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.756548 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qg8wq"
Apr 20 20:11:09.757102 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.757037 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2gbkm\""
Apr 20 20:11:09.757102 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.757050 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 20:11:09.757102 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.757039 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 20:11:09.758169 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.758135 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9fj49"
Apr 20 20:11:09.758798 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.758778 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:11:09.759118 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.759098 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qr2n8\""
Apr 20 20:11:09.759526 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.759431 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 20:11:09.760633 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.760618 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 20:11:09.760707 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.760623 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n495z\""
Apr 20 20:11:09.794538 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.794513 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:06:08 +0000 UTC" deadline="2027-10-23 15:31:50.926378715 +0000 UTC"
Apr 20 20:11:09.794538 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.794539 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13219h20m41.13184333s"
Apr 20 20:11:09.843645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.843614 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 20:11:09.857751 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.857728 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-ovn\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
Apr 20 20:11:09.857854 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.857767 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5e335ef-aa11-4058-9448-d253b1a467c3-host\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " pod="openshift-image-registry/node-ca-xq679"
Apr 20 20:11:09.857854 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.857797 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-cnibin\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49"
Apr 20 20:11:09.857994 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.857845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-cni-multus\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49"
Apr 20 20:11:09.857994 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.857925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbp9q\" (UniqueName: \"kubernetes.io/projected/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-kube-api-access-pbp9q\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt"
Apr 20 20:11:09.858095 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.857990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf"
Apr 20 20:11:09.858095 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-lib-modules\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq"
Apr 20 20:11:09.858095 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3246feba-bd0f-4518-845a-c9025f63125b-tmp-dir\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.858237 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858096 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/06fca970-a9c2-455e-87f0-219af27b7cfd-kubelet-config\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.858237 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-kubelet\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
Apr 20 20:11:09.858237 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-modprobe-d\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq"
Apr 20 20:11:09.858237 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-netns\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49"
Apr 20 20:11:09.858237 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/06fca970-a9c2-455e-87f0-219af27b7cfd-dbus\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.858461 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm"
Apr 20 20:11:09.858461 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-systemd-units\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
Apr 20 20:11:09.858461 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysctl-conf\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq"
Apr 20 20:11:09.858461 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-os-release\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49"
Apr 20 20:11:09.858461 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3246feba-bd0f-4518-845a-c9025f63125b-tmp-dir\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.858461 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858447 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/06fca970-a9c2-455e-87f0-219af27b7cfd-kubelet-config\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.858757 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.858757 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858517 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfmv7\" (UniqueName: \"kubernetes.io/projected/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-kube-api-access-lfmv7\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx"
Apr 20 20:11:09.858757 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-system-cni-dir\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm"
Apr 20 20:11:09.858757 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt"
Apr 20 20:11:09.858757 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858655 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-slash\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
Apr 20 20:11:09.858757 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-env-overrides\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
Apr 20 20:11:09.858757 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-sys\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq"
Apr 20 20:11:09.859082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-cni-bin\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49"
Apr 20 20:11:09.859082 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.858791 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
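The secret.go:189 "not registered" error is ordering, not a missing object: secrets and configmaps reach the kubelet through the per-object reflectors whose "Caches populated" lines surround it, and kube-system/original-pull-secret simply has not been delivered yet, so the mount fails and is retried. The same wait-before-read discipline in a client-go sketch (illustrative, not the kubelet's secret manager; assumed kubeconfig path):

// cachesync.go - wait for an informer cache to sync before reading its lister.
package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // illustrative path
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	factory := informers.NewSharedInformerFactoryWithOptions(cs, 0,
		informers.WithNamespace("kube-system"))
	secrets := factory.Core().V1().Secrets()
	informer := secrets.Informer() // must be created before Start
	factory.Start(ctx.Done())

	// Reading the lister before this returns is the moment at which the
	// kubelet would report "object ... not registered".
	if !cache.WaitForCacheSync(ctx.Done(), informer.HasSynced) {
		panic("cache never synced")
	}
	sec, err := secrets.Lister().Secrets("kube-system").Get("original-pull-secret")
	if err != nil {
		panic(err)
	}
	fmt.Println("secret available:", sec.Name)
}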
Apr 20 20:11:09.859082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3246feba-bd0f-4518-845a-c9025f63125b-hosts-file\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.859082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-node-log\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
Apr 20 20:11:09.859082 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.858923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3246feba-bd0f-4518-845a-c9025f63125b-hosts-file\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.859310 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/06fca970-a9c2-455e-87f0-219af27b7cfd-dbus\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:09.859310 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.859126 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret podName:06fca970-a9c2-455e-87f0-219af27b7cfd nodeName:}" failed. No retries permitted until 2026-04-20 20:11:10.358859776 +0000 UTC m=+3.037126828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret") pod "global-pull-secret-syncer-2cxr4" (UID: "06fca970-a9c2-455e-87f0-219af27b7cfd") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:11:09.859310 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859159 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
Apr 20 20:11:09.859310 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-k8s-cni-cncf-io\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49"
Apr 20 20:11:09.859310 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859246 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-daemon-config\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49"
Apr 20 20:11:09.859310 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859283 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-device-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.859531 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-device-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.859727 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dljjn\" (UniqueName: \"kubernetes.io/projected/35c64ee5-3248-42b5-92a6-7ceb4aaad134-kube-api-access-dljjn\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf"
Apr 20 20:11:09.859804 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrtxw\" (UniqueName: \"kubernetes.io/projected/3246feba-bd0f-4518-845a-c9025f63125b-kube-api-access-rrtxw\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq"
Apr 20 20:11:09.859804 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859775 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-etc-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv"
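nestedpendingoperations.go:348 shows the retry policy for the failed mount: the operation is embargoed for durationBeforeRetry, which starts at 500ms and grows exponentially with consecutive failures up to a cap (the exact constants live in the kubelet's nestedpendingoperations; the cap below is illustrative). The shape of that backoff:

// backoff.go - the shape of the per-volume retry embargo behind
// "durationBeforeRetry 500ms".
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initial  = 500 * time.Millisecond
		factor   = 2
		maxDelay = 2*time.Minute + 12*time.Second // illustrative cap
	)
	d := time.Duration(initial)
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("failure %2d: no retries permitted for %s\n", attempt, d)
		d *= factor
		if d > maxDelay {
			d = maxDelay
		}
	}
}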
\"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-etc-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.859895 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859800 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d7257b5-0312-4d95-b84e-bf7589326417-ovn-node-metrics-cert\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.859895 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-tuned\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.859895 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-kubelet\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.860029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-socket-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.860029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-var-lib-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.860029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-log-socket\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.860029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbmft\" (UniqueName: \"kubernetes.io/projected/4d7257b5-0312-4d95-b84e-bf7589326417-kube-api-access-wbmft\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.860029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.859993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/188b7548-da8a-4c62-b93e-536b9b6e09ea-cni-binary-copy\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " 
pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.860029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860011 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/47af8f64-fc60-48b5-84b3-4c3888a04afa-agent-certs\") pod \"konnectivity-agent-l9zp2\" (UID: \"47af8f64-fc60-48b5-84b3-4c3888a04afa\") " pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:09.860029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860025 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-os-release\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5e335ef-aa11-4058-9448-d253b1a467c3-serviceca\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-run\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860099 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-system-cni-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5pv8\" (UniqueName: \"kubernetes.io/projected/188b7548-da8a-4c62-b93e-536b9b6e09ea-kube-api-access-q5pv8\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860128 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-iptables-alerter-script\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-cni-bin\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860155 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-kubernetes\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860168 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdssn\" (UniqueName: \"kubernetes.io/projected/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-kube-api-access-bdssn\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/47af8f64-fc60-48b5-84b3-4c3888a04afa-konnectivity-ca\") pod \"konnectivity-agent-l9zp2\" (UID: \"47af8f64-fc60-48b5-84b3-4c3888a04afa\") " pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-host-slash\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-run-netns\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-cni-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-conf-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860266 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " 
pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-var-lib-kubelet\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.860327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-tmp\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.861056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-cnibin\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.861056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860332 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.861056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860370 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-registration-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.861056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860426 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-registration-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.861056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860553 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-socket-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.861056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.860650 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:11:09.861056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-iptables-alerter-script\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx" Apr 20 20:11:09.861323 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-host-slash\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx" Apr 20 20:11:09.861390 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-etc-selinux\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.861437 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47p59\" (UniqueName: \"kubernetes.io/projected/2ff55742-6058-4a56-941a-aec639653897-kube-api-access-47p59\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.861485 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-systemd\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.861485 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/47af8f64-fc60-48b5-84b3-4c3888a04afa-konnectivity-ca\") pod \"konnectivity-agent-l9zp2\" (UID: \"47af8f64-fc60-48b5-84b3-4c3888a04afa\") " pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:09.861485 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-etc-selinux\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.861485 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.861681 ip-10-0-134-66 kubenswrapper[2578]: I0420 
20:11:09.861522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-run-ovn-kubernetes\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.861681 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-ovnkube-config\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.861681 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-kubelet-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.861765 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861716 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-kubelet-dir\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.861815 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-cni-netd\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.861815 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861800 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-ovnkube-script-lib\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.861912 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxgvl\" (UniqueName: \"kubernetes.io/projected/b5e335ef-aa11-4058-9448-d253b1a467c3-kube-api-access-dxgvl\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:09.861912 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysconfig\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.861912 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-systemd\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.861912 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861877 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-socket-dir-parent\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.861912 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-multus-certs\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.862141 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-sys-fs\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.862141 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861969 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysctl-d\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.862141 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.861989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-host\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.862141 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.862008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-hostroot\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.862141 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.862027 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-etc-kubernetes\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.862141 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.862137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35c64ee5-3248-42b5-92a6-7ceb4aaad134-sys-fs\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.864452 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.864434 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/47af8f64-fc60-48b5-84b3-4c3888a04afa-agent-certs\") pod \"konnectivity-agent-l9zp2\" (UID: \"47af8f64-fc60-48b5-84b3-4c3888a04afa\") " pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:09.868453 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.868432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dljjn\" (UniqueName: \"kubernetes.io/projected/35c64ee5-3248-42b5-92a6-7ceb4aaad134-kube-api-access-dljjn\") pod \"aws-ebs-csi-driver-node-knjxf\" (UID: \"35c64ee5-3248-42b5-92a6-7ceb4aaad134\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:09.869011 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.868990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrtxw\" (UniqueName: \"kubernetes.io/projected/3246feba-bd0f-4518-845a-c9025f63125b-kube-api-access-rrtxw\") pod \"node-resolver-nd6kq\" (UID: \"3246feba-bd0f-4518-845a-c9025f63125b\") " pod="openshift-dns/node-resolver-nd6kq" Apr 20 20:11:09.872173 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.872150 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfmv7\" (UniqueName: \"kubernetes.io/projected/ed990d2c-a5ce-4f8a-b80f-44ae67582cd8-kube-api-access-lfmv7\") pod \"iptables-alerter-dnglx\" (UID: \"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8\") " pod="openshift-network-operator/iptables-alerter-dnglx" Apr 20 20:11:09.892309 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.892266 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" event={"ID":"2993d233f9a00d06d5f483ee8282a6f2","Type":"ContainerStarted","Data":"33b70bbab7af01fc36607b6890eac92b357ae8534e8e6bf3ec08d7790be49331"} Apr 20 20:11:09.893641 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.893617 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" event={"ID":"19015d708a7f5256313024ebc4553800","Type":"ContainerStarted","Data":"9920bfd09c8720532f1b2db77b9d68233e96ddc6e7fab763a191a064fb1d28c1"} Apr 20 20:11:09.963235 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-kubelet\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.963235 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-modprobe-d\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.963364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-netns\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.963364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963292 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.963364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-systemd-units\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.963364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysctl-conf\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.963364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-os-release\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-system-cni-dir\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-slash\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963417 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-env-overrides\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963430 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-sys\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 
20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-cni-bin\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963458 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-node-log\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-k8s-cni-cncf-io\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-daemon-config\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.963546 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-etc-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d7257b5-0312-4d95-b84e-bf7589326417-ovn-node-metrics-cert\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-tuned\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-kubelet\") pod \"multus-9fj49\" (UID: 
\"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-var-lib-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-log-socket\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbmft\" (UniqueName: \"kubernetes.io/projected/4d7257b5-0312-4d95-b84e-bf7589326417-kube-api-access-wbmft\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/188b7548-da8a-4c62-b93e-536b9b6e09ea-cni-binary-copy\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-os-release\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5e335ef-aa11-4058-9448-d253b1a467c3-serviceca\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-run\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-system-cni-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963838 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5pv8\" (UniqueName: \"kubernetes.io/projected/188b7548-da8a-4c62-b93e-536b9b6e09ea-kube-api-access-q5pv8\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963857 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-cni-bin\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-kubernetes\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdssn\" (UniqueName: \"kubernetes.io/projected/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-kube-api-access-bdssn\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.964007 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-run-netns\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963949 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-cni-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-conf-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.963995 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964011 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-var-lib-kubelet\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-tmp\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-cnibin\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47p59\" (UniqueName: \"kubernetes.io/projected/2ff55742-6058-4a56-941a-aec639653897-kube-api-access-47p59\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-systemd\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964138 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-run-ovn-kubernetes\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-ovnkube-config\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 
20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-cni-netd\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-ovnkube-script-lib\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxgvl\" (UniqueName: \"kubernetes.io/projected/b5e335ef-aa11-4058-9448-d253b1a467c3-kube-api-access-dxgvl\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysconfig\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.964743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-systemd\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-socket-dir-parent\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-multus-certs\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysctl-d\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-host\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-hostroot\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-etc-kubernetes\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-ovn\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5e335ef-aa11-4058-9448-d253b1a467c3-host\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-cnibin\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-cni-multus\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbp9q\" (UniqueName: \"kubernetes.io/projected/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-kube-api-access-pbp9q\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-lib-modules\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964760 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-lib-modules\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-kubelet\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-modprobe-d\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.964886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-netns\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.965481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.965407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.965470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-run-netns\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.965527 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-systemd-units\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.965530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-cni-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.965562 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-conf-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " 
pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.965674 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysctl-conf\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.965728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-os-release\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.965765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-system-cni-dir\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.965835 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.965894 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:11:10.465877436 +0000 UTC m=+3.144144484 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-slash\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.966300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-var-lib-kubelet\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-systemd\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966426 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-cnibin\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966426 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-host\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-hostroot\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysctl-d\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966505 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-sys\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-log-socket\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966531 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-cni-bin\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966559 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-node-log\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-socket-dir-parent\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966613 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-systemd\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-multus-certs\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966641 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-run-k8s-cni-cncf-io\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.966850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966676 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-cni-netd\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-run\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-env-overrides\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-system-cni-dir\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967282 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-cnibin\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-run-ovn\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-kubernetes\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-sysconfig\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-kubelet\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.967593 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967519 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-host-var-lib-cni-multus\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.968020 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.966672 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-etc-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.968020 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-cni-bin\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.968020 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5e335ef-aa11-4058-9448-d253b1a467c3-host\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:09.968020 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/188b7548-da8a-4c62-b93e-536b9b6e09ea-etc-kubernetes\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.968020 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.967940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2ff55742-6058-4a56-941a-aec639653897-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.968020 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.968002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-var-lib-openvswitch\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.968283 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.968058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/2ff55742-6058-4a56-941a-aec639653897-os-release\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.968334 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.968314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-ovnkube-script-lib\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.968429 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.968375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d7257b5-0312-4d95-b84e-bf7589326417-host-run-ovn-kubernetes\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.968429 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.968351 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/188b7548-da8a-4c62-b93e-536b9b6e09ea-multus-daemon-config\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.968429 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.968407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5e335ef-aa11-4058-9448-d253b1a467c3-serviceca\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:09.968937 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.968916 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/188b7548-da8a-4c62-b93e-536b9b6e09ea-cni-binary-copy\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:09.969267 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.969246 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d7257b5-0312-4d95-b84e-bf7589326417-ovnkube-config\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.969328 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.969250 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-etc-tuned\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.969328 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.969302 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-tmp\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.971404 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.971383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4d7257b5-0312-4d95-b84e-bf7589326417-ovn-node-metrics-cert\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.973658 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.973638 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:09.973746 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.973661 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:09.973746 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.973674 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r2fn7 for pod openshift-network-diagnostics/network-check-target-g75rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:09.973858 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:09.973771 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7 podName:c5e48953-ddd6-4f02-b2d7-7a42dede535f nodeName:}" failed. No retries permitted until 2026-04-20 20:11:10.473754526 +0000 UTC m=+3.152021584 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-r2fn7" (UniqueName: "kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7") pod "network-check-target-g75rf" (UID: "c5e48953-ddd6-4f02-b2d7-7a42dede535f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:09.975192 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.975122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdssn\" (UniqueName: \"kubernetes.io/projected/e9a81a97-edd6-47c8-a5ab-611a4f60d22a-kube-api-access-bdssn\") pod \"tuned-qg8wq\" (UID: \"e9a81a97-edd6-47c8-a5ab-611a4f60d22a\") " pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:09.976958 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.976512 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbmft\" (UniqueName: \"kubernetes.io/projected/4d7257b5-0312-4d95-b84e-bf7589326417-kube-api-access-wbmft\") pod \"ovnkube-node-zg9wv\" (UID: \"4d7257b5-0312-4d95-b84e-bf7589326417\") " pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:09.977252 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.977227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbp9q\" (UniqueName: \"kubernetes.io/projected/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-kube-api-access-pbp9q\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:09.977747 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.977728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxgvl\" (UniqueName: \"kubernetes.io/projected/b5e335ef-aa11-4058-9448-d253b1a467c3-kube-api-access-dxgvl\") pod \"node-ca-xq679\" (UID: \"b5e335ef-aa11-4058-9448-d253b1a467c3\") " 
pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:09.977835 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.977798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47p59\" (UniqueName: \"kubernetes.io/projected/2ff55742-6058-4a56-941a-aec639653897-kube-api-access-47p59\") pod \"multus-additional-cni-plugins-qdxkm\" (UID: \"2ff55742-6058-4a56-941a-aec639653897\") " pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:09.978732 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:09.978712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5pv8\" (UniqueName: \"kubernetes.io/projected/188b7548-da8a-4c62-b93e-536b9b6e09ea-kube-api-access-q5pv8\") pod \"multus-9fj49\" (UID: \"188b7548-da8a-4c62-b93e-536b9b6e09ea\") " pod="openshift-multus/multus-9fj49" Apr 20 20:11:10.053259 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.053233 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:10.061014 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.060990 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" Apr 20 20:11:10.069650 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.069634 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nd6kq" Apr 20 20:11:10.076235 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.076217 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dnglx" Apr 20 20:11:10.083769 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.083755 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xq679" Apr 20 20:11:10.091291 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.091272 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" Apr 20 20:11:10.098891 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.098873 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:10.112439 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.112413 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" Apr 20 20:11:10.125000 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.124981 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9fj49" Apr 20 20:11:10.368235 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.368166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:10.368376 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.368321 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:10.368434 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.368392 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret podName:06fca970-a9c2-455e-87f0-219af27b7cfd nodeName:}" failed. No retries permitted until 2026-04-20 20:11:11.368376431 +0000 UTC m=+4.046643469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret") pod "global-pull-secret-syncer-2cxr4" (UID: "06fca970-a9c2-455e-87f0-219af27b7cfd") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:10.469324 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.469291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:10.469466 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.469429 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:10.469523 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.469484 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:11:11.469470173 +0000 UTC m=+4.147737208 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:10.570091 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.570060 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:10.570250 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.570226 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:10.570250 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.570247 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:10.570362 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.570259 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r2fn7 for pod openshift-network-diagnostics/network-check-target-g75rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:10.570362 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.570319 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7 podName:c5e48953-ddd6-4f02-b2d7-7a42dede535f nodeName:}" failed. No retries permitted until 2026-04-20 20:11:11.570300259 +0000 UTC m=+4.248567296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r2fn7" (UniqueName: "kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7") pod "network-check-target-g75rf" (UID: "c5e48953-ddd6-4f02-b2d7-7a42dede535f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:10.794760 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.794672 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:06:08 +0000 UTC" deadline="2027-11-02 20:46:46.870657767 +0000 UTC" Apr 20 20:11:10.794760 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.794708 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13464h35m36.075953176s" Apr 20 20:11:10.829966 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.829915 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3246feba_bd0f_4518_845a_c9025f63125b.slice/crio-acc471307795afc3ac49b793680db0d54c777e81016713634fae2fdd546ae686 WatchSource:0}: Error finding container acc471307795afc3ac49b793680db0d54c777e81016713634fae2fdd546ae686: Status 404 returned error can't find the container with id acc471307795afc3ac49b793680db0d54c777e81016713634fae2fdd546ae686 Apr 20 20:11:10.847303 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.847227 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c64ee5_3248_42b5_92a6_7ceb4aaad134.slice/crio-87e31dfcbc1c1a12596a06131befecabcbae5f322f3a6d5148bdb709374adf75 WatchSource:0}: Error finding container 87e31dfcbc1c1a12596a06131befecabcbae5f322f3a6d5148bdb709374adf75: Status 404 returned error can't find the container with id 87e31dfcbc1c1a12596a06131befecabcbae5f322f3a6d5148bdb709374adf75 Apr 20 20:11:10.849535 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.848103 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47af8f64_fc60_48b5_84b3_4c3888a04afa.slice/crio-042f2fcf72522e84cb41b08a89ddc291617519f3f8beb4834f402b746ae133cf WatchSource:0}: Error finding container 042f2fcf72522e84cb41b08a89ddc291617519f3f8beb4834f402b746ae133cf: Status 404 returned error can't find the container with id 042f2fcf72522e84cb41b08a89ddc291617519f3f8beb4834f402b746ae133cf Apr 20 20:11:10.849535 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.848922 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ff55742_6058_4a56_941a_aec639653897.slice/crio-54ad30bcf9ed14ec6144f0a819b0be93a7cd9c78d5857ec72e702676eab4684b WatchSource:0}: Error finding container 54ad30bcf9ed14ec6144f0a819b0be93a7cd9c78d5857ec72e702676eab4684b: Status 404 returned error can't find the container with id 54ad30bcf9ed14ec6144f0a819b0be93a7cd9c78d5857ec72e702676eab4684b Apr 20 20:11:10.850102 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.850066 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188b7548_da8a_4c62_b93e_536b9b6e09ea.slice/crio-ec41c3fa2d089480735828535c21b9abf1023a752bc83864e591835aebe408ca WatchSource:0}: Error finding container 
ec41c3fa2d089480735828535c21b9abf1023a752bc83864e591835aebe408ca: Status 404 returned error can't find the container with id ec41c3fa2d089480735828535c21b9abf1023a752bc83864e591835aebe408ca Apr 20 20:11:10.850538 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.850495 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7257b5_0312_4d95_b84e_bf7589326417.slice/crio-a37f75a40667d2b5722e144bfbb853e68a347533372c881100f6147102325527 WatchSource:0}: Error finding container a37f75a40667d2b5722e144bfbb853e68a347533372c881100f6147102325527: Status 404 returned error can't find the container with id a37f75a40667d2b5722e144bfbb853e68a347533372c881100f6147102325527 Apr 20 20:11:10.852208 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.851974 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a81a97_edd6_47c8_a5ab_611a4f60d22a.slice/crio-6a3714d22f8988798b65e1383537dfc7692daa3a547b4d37c0adddf7a4c55c7d WatchSource:0}: Error finding container 6a3714d22f8988798b65e1383537dfc7692daa3a547b4d37c0adddf7a4c55c7d: Status 404 returned error can't find the container with id 6a3714d22f8988798b65e1383537dfc7692daa3a547b4d37c0adddf7a4c55c7d Apr 20 20:11:10.852473 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.852451 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded990d2c_a5ce_4f8a_b80f_44ae67582cd8.slice/crio-3b82ab7ea71db30682435478ec86ad9aea45f8e86e2fd2049caf2bd164a50a8e WatchSource:0}: Error finding container 3b82ab7ea71db30682435478ec86ad9aea45f8e86e2fd2049caf2bd164a50a8e: Status 404 returned error can't find the container with id 3b82ab7ea71db30682435478ec86ad9aea45f8e86e2fd2049caf2bd164a50a8e Apr 20 20:11:10.853509 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:11:10.853486 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e335ef_aa11_4058_9448_d253b1a467c3.slice/crio-0620fbe8f17081eacbc0cad50c05b01e76d0d591446ccf118f0f68382ab9ff4c WatchSource:0}: Error finding container 0620fbe8f17081eacbc0cad50c05b01e76d0d591446ccf118f0f68382ab9ff4c: Status 404 returned error can't find the container with id 0620fbe8f17081eacbc0cad50c05b01e76d0d591446ccf118f0f68382ab9ff4c Apr 20 20:11:10.886231 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.886211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:10.886327 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.886215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:10.886440 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.886322 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:10.886440 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:10.886415 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:10.899076 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.898918 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xq679" event={"ID":"b5e335ef-aa11-4058-9448-d253b1a467c3","Type":"ContainerStarted","Data":"0620fbe8f17081eacbc0cad50c05b01e76d0d591446ccf118f0f68382ab9ff4c"} Apr 20 20:11:10.899948 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.899931 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" event={"ID":"e9a81a97-edd6-47c8-a5ab-611a4f60d22a","Type":"ContainerStarted","Data":"6a3714d22f8988798b65e1383537dfc7692daa3a547b4d37c0adddf7a4c55c7d"} Apr 20 20:11:10.901002 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.900985 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"a37f75a40667d2b5722e144bfbb853e68a347533372c881100f6147102325527"} Apr 20 20:11:10.902135 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.902110 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9fj49" event={"ID":"188b7548-da8a-4c62-b93e-536b9b6e09ea","Type":"ContainerStarted","Data":"ec41c3fa2d089480735828535c21b9abf1023a752bc83864e591835aebe408ca"} Apr 20 20:11:10.903177 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.903158 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" event={"ID":"2ff55742-6058-4a56-941a-aec639653897","Type":"ContainerStarted","Data":"54ad30bcf9ed14ec6144f0a819b0be93a7cd9c78d5857ec72e702676eab4684b"} Apr 20 20:11:10.903978 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.903959 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l9zp2" event={"ID":"47af8f64-fc60-48b5-84b3-4c3888a04afa","Type":"ContainerStarted","Data":"042f2fcf72522e84cb41b08a89ddc291617519f3f8beb4834f402b746ae133cf"} Apr 20 20:11:10.904994 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.904967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dnglx" event={"ID":"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8","Type":"ContainerStarted","Data":"3b82ab7ea71db30682435478ec86ad9aea45f8e86e2fd2049caf2bd164a50a8e"} Apr 20 20:11:10.905931 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.905914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" event={"ID":"35c64ee5-3248-42b5-92a6-7ceb4aaad134","Type":"ContainerStarted","Data":"87e31dfcbc1c1a12596a06131befecabcbae5f322f3a6d5148bdb709374adf75"} Apr 20 20:11:10.906771 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:10.906755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nd6kq" 
event={"ID":"3246feba-bd0f-4518-845a-c9025f63125b","Type":"ContainerStarted","Data":"acc471307795afc3ac49b793680db0d54c777e81016713634fae2fdd546ae686"} Apr 20 20:11:11.377177 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:11.376851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:11.377177 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.377043 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:11.377177 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.377100 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret podName:06fca970-a9c2-455e-87f0-219af27b7cfd nodeName:}" failed. No retries permitted until 2026-04-20 20:11:13.377082532 +0000 UTC m=+6.055349569 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret") pod "global-pull-secret-syncer-2cxr4" (UID: "06fca970-a9c2-455e-87f0-219af27b7cfd") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:11.477667 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:11.477636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:11.477838 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.477779 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:11.477838 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.477833 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:11:13.47781578 +0000 UTC m=+6.156082819 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:11.578729 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:11.578699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:11.579100 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.578886 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:11.579100 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.578909 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:11.579100 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.578922 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r2fn7 for pod openshift-network-diagnostics/network-check-target-g75rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:11.579100 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.578974 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7 podName:c5e48953-ddd6-4f02-b2d7-7a42dede535f nodeName:}" failed. No retries permitted until 2026-04-20 20:11:13.57895551 +0000 UTC m=+6.257222547 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2fn7" (UniqueName: "kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7") pod "network-check-target-g75rf" (UID: "c5e48953-ddd6-4f02-b2d7-7a42dede535f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:11.888958 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:11.888395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:11.888958 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:11.888535 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:11.927178 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:11.926616 2578 generic.go:358] "Generic (PLEG): container finished" podID="2993d233f9a00d06d5f483ee8282a6f2" containerID="186e79b92cb090e8b2c9565b192f4974d586364e47e162454bb862345ce26b81" exitCode=0 Apr 20 20:11:11.927178 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:11.926780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" event={"ID":"2993d233f9a00d06d5f483ee8282a6f2","Type":"ContainerDied","Data":"186e79b92cb090e8b2c9565b192f4974d586364e47e162454bb862345ce26b81"} Apr 20 20:11:11.964844 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:11.964313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" event={"ID":"19015d708a7f5256313024ebc4553800","Type":"ContainerStarted","Data":"58ed959a51b44d2a6df0baee450ca7506b2f0737762ea858c9cefcf2ec17c7d7"} Apr 20 20:11:11.979397 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:11.978393 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-66.ec2.internal" podStartSLOduration=2.978377012 podStartE2EDuration="2.978377012s" podCreationTimestamp="2026-04-20 20:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:11:11.978316367 +0000 UTC m=+4.656583427" watchObservedRunningTime="2026-04-20 20:11:11.978377012 +0000 UTC m=+4.656644070" Apr 20 20:11:12.886168 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:12.886097 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:12.886349 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:12.886215 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:12.886647 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:12.886626 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:12.886742 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:12.886721 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:12.976711 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:12.976661 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" event={"ID":"2993d233f9a00d06d5f483ee8282a6f2","Type":"ContainerStarted","Data":"935766e7a331ce2ee68097b198f58664ba1509edf2ced5537f886bf98007121b"} Apr 20 20:11:12.991967 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:12.991921 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-66.ec2.internal" podStartSLOduration=3.99190531 podStartE2EDuration="3.99190531s" podCreationTimestamp="2026-04-20 20:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:11:12.990830968 +0000 UTC m=+5.669098025" watchObservedRunningTime="2026-04-20 20:11:12.99190531 +0000 UTC m=+5.670172369" Apr 20 20:11:13.395597 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:13.395540 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:13.395776 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.395746 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:13.395832 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.395802 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret podName:06fca970-a9c2-455e-87f0-219af27b7cfd nodeName:}" failed. No retries permitted until 2026-04-20 20:11:17.395784637 +0000 UTC m=+10.074051672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret") pod "global-pull-secret-syncer-2cxr4" (UID: "06fca970-a9c2-455e-87f0-219af27b7cfd") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:13.496152 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:13.496118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:13.496318 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.496293 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:13.496373 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.496348 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:11:17.496330511 +0000 UTC m=+10.174597550 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:13.597828 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:13.597256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:13.597828 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.597396 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:13.597828 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.597413 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:13.597828 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.597425 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r2fn7 for pod openshift-network-diagnostics/network-check-target-g75rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:13.597828 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.597478 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7 podName:c5e48953-ddd6-4f02-b2d7-7a42dede535f nodeName:}" failed. No retries permitted until 2026-04-20 20:11:17.597458637 +0000 UTC m=+10.275725677 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2fn7" (UniqueName: "kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7") pod "network-check-target-g75rf" (UID: "c5e48953-ddd6-4f02-b2d7-7a42dede535f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:13.890101 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:13.890067 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:13.890283 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:13.890204 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:14.886657 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:14.886624 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:14.887092 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:14.886624 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:14.887092 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:14.886757 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:14.887092 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:14.886815 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:15.886731 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:15.886693 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:15.887170 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:15.886822 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:16.886562 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:16.886528 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:16.886746 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:16.886564 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:16.886746 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:16.886669 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:16.887095 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:16.886785 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:17.428133 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:17.428099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:17.428311 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.428292 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:17.428387 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.428357 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret podName:06fca970-a9c2-455e-87f0-219af27b7cfd nodeName:}" failed. No retries permitted until 2026-04-20 20:11:25.428338654 +0000 UTC m=+18.106605692 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret") pod "global-pull-secret-syncer-2cxr4" (UID: "06fca970-a9c2-455e-87f0-219af27b7cfd") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:17.529012 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:17.528977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:17.529186 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.529139 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:17.529247 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.529197 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:11:25.529177945 +0000 UTC m=+18.207444986 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:17.629681 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:17.629636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:17.629907 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.629875 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:17.629997 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.629914 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:17.629997 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.629928 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r2fn7 for pod openshift-network-diagnostics/network-check-target-g75rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:17.630100 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.630012 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7 podName:c5e48953-ddd6-4f02-b2d7-7a42dede535f nodeName:}" failed. No retries permitted until 2026-04-20 20:11:25.629973511 +0000 UTC m=+18.308240560 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2fn7" (UniqueName: "kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7") pod "network-check-target-g75rf" (UID: "c5e48953-ddd6-4f02-b2d7-7a42dede535f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:17.887633 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:17.887539 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:17.887952 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:17.887664 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:18.886235 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:18.886205 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:18.886430 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:18.886312 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:18.886430 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:18.886205 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:18.886536 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:18.886437 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:19.888332 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:19.886328 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:19.888332 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:19.886471 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:20.886790 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:20.886753 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:20.886943 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:20.886854 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:20.887159 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:20.886753 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:20.887230 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:20.887214 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:21.886199 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:21.886166 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:21.886651 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:21.886287 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:22.886948 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:22.886914 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:22.887373 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:22.886920 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:22.887373 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:22.887049 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:22.887373 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:22.887146 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:23.886629 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:23.886601 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:23.886805 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:23.886716 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:24.886818 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:24.886781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:24.887241 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:24.886819 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:24.887241 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:24.886921 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:24.887241 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:24.887066 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:25.487028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:25.486986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:25.487207 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.487106 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:25.487207 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.487189 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret podName:06fca970-a9c2-455e-87f0-219af27b7cfd nodeName:}" failed. No retries permitted until 2026-04-20 20:11:41.48717055 +0000 UTC m=+34.165437588 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret") pod "global-pull-secret-syncer-2cxr4" (UID: "06fca970-a9c2-455e-87f0-219af27b7cfd") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:25.587627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:25.587573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:25.587808 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.587694 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:25.587808 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.587759 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:11:41.587737245 +0000 UTC m=+34.266004285 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:25.688733 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:25.688699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:25.688907 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.688887 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:25.688907 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.688905 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:25.689030 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.688919 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r2fn7 for pod openshift-network-diagnostics/network-check-target-g75rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:25.689030 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.688974 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7 podName:c5e48953-ddd6-4f02-b2d7-7a42dede535f nodeName:}" failed. No retries permitted until 2026-04-20 20:11:41.688955703 +0000 UTC m=+34.367222755 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2fn7" (UniqueName: "kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7") pod "network-check-target-g75rf" (UID: "c5e48953-ddd6-4f02-b2d7-7a42dede535f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:25.889292 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:25.889215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:25.889694 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:25.889320 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:26.886655 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:26.886623 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:26.886655 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:26.886650 2578 util.go:30] "No sandbox for pod can be found. 
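
Note: the mount failures above are retried under the kubelet's exponential backoff. The same three volumes failed with durationBeforeRetry 8s (retry allowed at m=+18), fail again here with 16s (m=+34), and fail once more further down with 32s (m=+66): the delay doubles on each consecutive failure. A minimal Go sketch of that doubling-with-cap shape; the 2m ceiling is an assumption for illustration and is not visible in this log (the kubelet's own logic lives in nestedpendingoperations.go):

package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the previous retry delay up to a ceiling, mirroring the
// durationBeforeRetry progression in the log (8s -> 16s -> 32s).
func nextBackoff(prev, ceiling time.Duration) time.Duration {
	if next := 2 * prev; next < ceiling {
		return next
	}
	return ceiling
}

func main() {
	delay := 8 * time.Second // first delay observed in this log (m=+18)
	for i := 0; i < 4; i++ {
		fmt.Println(delay) // 8s, 16s, 32s, then capped
		delay = nextBackoff(delay, 2*time.Minute) // ceiling is assumed
	}
}
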
Apr 20 20:11:26.886655 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:26.886650 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf"
Apr 20 20:11:26.887119 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:26.887086 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd"
Apr 20 20:11:26.887246 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:26.887178 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f"
Apr 20 20:11:27.894974 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:27.894948 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt"
Apr 20 20:11:27.895374 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:27.895080 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d"
Apr 20 20:11:28.014738 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:28.014280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" event={"ID":"35c64ee5-3248-42b5-92a6-7ceb4aaad134","Type":"ContainerStarted","Data":"fb594ac82a2ce45b23a9f954f95b7b9f4cc71b8d3b99abfbc0709672f72c0d4f"}
Apr 20 20:11:28.021050 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:28.016173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" event={"ID":"e9a81a97-edd6-47c8-a5ab-611a4f60d22a","Type":"ContainerStarted","Data":"fa3761feeb5a95ef334c873e27eb4af8387c44e78bc61fb4ff9df3f91ff2a7c0"}
Apr 20 20:11:28.886128 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:28.886099 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf"
Apr 20 20:11:28.886128 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:28.886135 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4"
Apr 20 20:11:28.886303 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:28.886204 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f"
Apr 20 20:11:28.886303 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:28.886279 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd"
Apr 20 20:11:29.029442 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.029385 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log"
Apr 20 20:11:29.030218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.029708 2578 generic.go:358] "Generic (PLEG): container finished" podID="4d7257b5-0312-4d95-b84e-bf7589326417" containerID="576c63ecf19ee6bfe3ee3cfa736cb46cc4757d8ea0843ac56d1538267454e04e" exitCode=1
Apr 20 20:11:29.030218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.029763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"d58287ae2e7df887074b63cb0f530b7e284f7133d8a058dbea3580bb87f50601"}
Apr 20 20:11:29.030218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.029792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"00dbd253f448da74db59a605e88b4d86d60aad5f2061d6291647739d1c494e79"}
Apr 20 20:11:29.030218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.029801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"adbddbd5f0d9105a394a4ea13488a8d20ea0ec299ce1214650d73909de9b2972"}
Apr 20 20:11:29.030218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.029808 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"e33b3fa988ea2a8e9aa526225eb10e13e660dc5e75a305b9afe390aad3fa6927"}
Apr 20 20:11:29.030218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.029822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"a11bcbc1416dfa74c480ceadcdc9141e66e051ae626596dca46bc41aefa93540"}
Apr 20 20:11:29.030218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.029830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerDied","Data":"576c63ecf19ee6bfe3ee3cfa736cb46cc4757d8ea0843ac56d1538267454e04e"}
Apr 20 20:11:29.031104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.031085 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9fj49" event={"ID":"188b7548-da8a-4c62-b93e-536b9b6e09ea","Type":"ContainerStarted","Data":"e2e5d4e2b578db4cfb15f8b502e36d3af1d287071d58eaa07d10e4cbef87b1df"}
Apr 20 20:11:29.032372 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.032350 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ff55742-6058-4a56-941a-aec639653897" containerID="5846e127732fec0a74d05f589c62006d7bec2878ba5f2fd64b494c219538f02c" exitCode=0
Apr 20 20:11:29.032488 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.032407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" event={"ID":"2ff55742-6058-4a56-941a-aec639653897","Type":"ContainerDied","Data":"5846e127732fec0a74d05f589c62006d7bec2878ba5f2fd64b494c219538f02c"}
Apr 20 20:11:29.033707 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.033685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l9zp2" event={"ID":"47af8f64-fc60-48b5-84b3-4c3888a04afa","Type":"ContainerStarted","Data":"3caf70a1059a7daf47eb7f5f1f14d6179ba52e12fdb4f8231e390dd9c04228bd"}
Apr 20 20:11:29.034986 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.034968 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nd6kq" event={"ID":"3246feba-bd0f-4518-845a-c9025f63125b","Type":"ContainerStarted","Data":"825eac1d6940c55f818717e1d191a6fb93fddd38d48c0e5ecf3215ab411cfda6"}
Apr 20 20:11:29.036163 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.036138 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xq679" event={"ID":"b5e335ef-aa11-4058-9448-d253b1a467c3","Type":"ContainerStarted","Data":"32279d6a713ad2ac8a7430e84e5d65c6f64c6f23d95391f9b47bd1ea058d88b0"}
Apr 20 20:11:29.048058 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.048023 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qg8wq" podStartSLOduration=4.245321337 podStartE2EDuration="21.048010205s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.854921842 +0000 UTC m=+3.533188877" lastFinishedPulling="2026-04-20 20:11:27.657610704 +0000 UTC m=+20.335877745" observedRunningTime="2026-04-20 20:11:28.038407047 +0000 UTC m=+20.716674105" watchObservedRunningTime="2026-04-20 20:11:29.048010205 +0000 UTC m=+21.726277264"
Apr 20 20:11:29.048175 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.048154 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9fj49" podStartSLOduration=4.082247955 podStartE2EDuration="21.0481498s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.851788977 +0000 UTC m=+3.530056011" lastFinishedPulling="2026-04-20 20:11:27.817690817 +0000 UTC m=+20.495957856" observedRunningTime="2026-04-20 20:11:29.047618324 +0000 UTC m=+21.725885376" watchObservedRunningTime="2026-04-20 20:11:29.0481498 +0000 UTC m=+21.726416835"
Apr 20 20:11:29.071997 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.071965 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nd6kq" podStartSLOduration=4.12154075 podStartE2EDuration="21.071955236s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.839669874 +0000 UTC m=+3.517936909" lastFinishedPulling="2026-04-20 20:11:27.790084346 +0000 UTC m=+20.468351395" observedRunningTime="2026-04-20 20:11:29.071827122 +0000 UTC m=+21.750094176" watchObservedRunningTime="2026-04-20 20:11:29.071955236 +0000 UTC m=+21.750222273"
Apr 20 20:11:29.095722 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.095688 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l9zp2" podStartSLOduration=5.287880299 podStartE2EDuration="22.095679493s" podCreationTimestamp="2026-04-20 20:11:07 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.84979577 +0000 UTC m=+3.528062821" lastFinishedPulling="2026-04-20 20:11:27.657594977 +0000 UTC m=+20.335862015" observedRunningTime="2026-04-20 20:11:29.082879874 +0000 UTC m=+21.761146931" watchObservedRunningTime="2026-04-20 20:11:29.095679493 +0000 UTC m=+21.773946601"
Apr 20 20:11:29.488898 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.488872 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 20:11:29.803409 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.803311 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:11:29.488894364Z","UUID":"e8338a24-32ff-4667-8ba0-644379ee44e1","Handler":null,"Name":"","Endpoint":""}
Apr 20 20:11:29.806242 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.806215 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 20:11:29.806242 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.806245 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 20:11:29.886545 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:29.886520 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt"
pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:30.039978 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:30.039947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dnglx" event={"ID":"ed990d2c-a5ce-4f8a-b80f-44ae67582cd8","Type":"ContainerStarted","Data":"95bba0b2267ee89b23c7d9c2e578b1a1fb2e2ed6c993122fa146013c60cb6660"} Apr 20 20:11:30.041787 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:30.041757 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" event={"ID":"35c64ee5-3248-42b5-92a6-7ceb4aaad134","Type":"ContainerStarted","Data":"3b5fd53c8b296b78db886712fc3af3c41f70827b194e14a4f6a598e833b78ff4"} Apr 20 20:11:30.053241 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:30.053204 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xq679" podStartSLOduration=5.137484954 podStartE2EDuration="22.053191382s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.85603463 +0000 UTC m=+3.534301680" lastFinishedPulling="2026-04-20 20:11:27.77174106 +0000 UTC m=+20.450008108" observedRunningTime="2026-04-20 20:11:29.095431021 +0000 UTC m=+21.773698077" watchObservedRunningTime="2026-04-20 20:11:30.053191382 +0000 UTC m=+22.731458439" Apr 20 20:11:30.053892 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:30.053835 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dnglx" podStartSLOduration=5.115805276 podStartE2EDuration="22.053822029s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.85488914 +0000 UTC m=+3.533156177" lastFinishedPulling="2026-04-20 20:11:27.79290588 +0000 UTC m=+20.471172930" observedRunningTime="2026-04-20 20:11:30.053035679 +0000 UTC m=+22.731302736" watchObservedRunningTime="2026-04-20 20:11:30.053822029 +0000 UTC m=+22.732089087" Apr 20 20:11:30.886205 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:30.886136 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:30.886332 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:30.886260 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:30.886687 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:30.886663 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:30.886861 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:30.886758 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:31.045815 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:31.045773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" event={"ID":"35c64ee5-3248-42b5-92a6-7ceb4aaad134","Type":"ContainerStarted","Data":"dcb9a3898887f577844db818e3932520276b531598aee04b2a5ebcd23166a87f"} Apr 20 20:11:31.048650 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:31.048629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:11:31.049053 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:31.049027 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"6d75250fca7772710631fd1205703d769e156292c5011e5946eae94a25bcbba0"} Apr 20 20:11:31.066462 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:31.066417 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-knjxf" podStartSLOduration=4.093173574 podStartE2EDuration="24.066404364s" podCreationTimestamp="2026-04-20 20:11:07 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.849091583 +0000 UTC m=+3.527358621" lastFinishedPulling="2026-04-20 20:11:30.822322362 +0000 UTC m=+23.500589411" observedRunningTime="2026-04-20 20:11:31.065844877 +0000 UTC m=+23.744111935" watchObservedRunningTime="2026-04-20 20:11:31.066404364 +0000 UTC m=+23.744671420" Apr 20 20:11:31.886274 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:31.886048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:31.886471 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:31.886379 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:32.040739 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:32.040707 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:32.041339 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:32.041317 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:32.051280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:32.051236 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:32.051727 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:32.051534 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l9zp2" Apr 20 20:11:32.886906 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:32.886874 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:32.887065 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:32.886874 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:32.887065 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:32.886980 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:32.887174 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:32.887088 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:33.887042 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:33.886852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:33.887778 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:33.887142 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:34.056678 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.056651 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:11:34.057016 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.056993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"50019304a7a645d1165bf7bf222421423544a40de3d25f4531760e9dec95a744"} Apr 20 20:11:34.057322 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.057304 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:34.057454 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.057440 2578 scope.go:117] "RemoveContainer" containerID="576c63ecf19ee6bfe3ee3cfa736cb46cc4757d8ea0843ac56d1538267454e04e" Apr 20 20:11:34.058653 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.058628 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ff55742-6058-4a56-941a-aec639653897" containerID="f32a326b438fb11a38fb403f64563bbe11fcf0a42cda0dcc3cb86bec52a8aa16" exitCode=0 Apr 20 20:11:34.058776 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.058705 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" event={"ID":"2ff55742-6058-4a56-941a-aec639653897","Type":"ContainerDied","Data":"f32a326b438fb11a38fb403f64563bbe11fcf0a42cda0dcc3cb86bec52a8aa16"} Apr 20 20:11:34.074214 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.074195 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 
20:11:34.886865 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.886817 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:34.887012 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:34.886817 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:34.887012 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:34.886940 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:34.887433 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:34.887013 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:35.066389 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.066362 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:11:35.066724 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.066702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" event={"ID":"4d7257b5-0312-4d95-b84e-bf7589326417","Type":"ContainerStarted","Data":"d52565b7c1aeb954b9fc28570ab7d1d01161c6afbf204092c5a71aeb970c0214"} Apr 20 20:11:35.066917 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.066900 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:35.066956 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.066926 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:35.068683 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.068633 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ff55742-6058-4a56-941a-aec639653897" containerID="ea7b5bc7a696bdc39b16e9dec8e05b19b891203617cffff3b3e73380c6a0750a" exitCode=0 Apr 20 20:11:35.068795 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.068704 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" event={"ID":"2ff55742-6058-4a56-941a-aec639653897","Type":"ContainerDied","Data":"ea7b5bc7a696bdc39b16e9dec8e05b19b891203617cffff3b3e73380c6a0750a"} Apr 20 20:11:35.081800 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.081780 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:11:35.093048 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.093012 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" podStartSLOduration=10.086744404 podStartE2EDuration="27.093000827s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" 
firstStartedPulling="2026-04-20 20:11:10.853149991 +0000 UTC m=+3.531417029" lastFinishedPulling="2026-04-20 20:11:27.859406417 +0000 UTC m=+20.537673452" observedRunningTime="2026-04-20 20:11:35.09111264 +0000 UTC m=+27.769379697" watchObservedRunningTime="2026-04-20 20:11:35.093000827 +0000 UTC m=+27.771267950" Apr 20 20:11:35.112703 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.112679 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-g75rf"] Apr 20 20:11:35.112835 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.112790 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:35.112984 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:35.112954 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:35.113304 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.113284 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2cxr4"] Apr 20 20:11:35.113390 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.113382 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:35.113481 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:35.113465 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:35.123832 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.123799 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zc8kt"] Apr 20 20:11:35.123932 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:35.123915 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:35.124042 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:35.124014 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
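
Note: every "Error syncing pod, skipping" line in this section has the same cause: the container runtime reports NetworkReady=false until a CNI configuration file exists under /etc/kubernetes/cni/net.d/, and the kubelet will not create new pod sandboxes while that holds. The config is written once the network plugin pods come up; consistent with that, ovnkube-node-zg9wv passes its readiness probe at 20:11:35 above and the node reports NodeReady at 20:11:41 below, after which the errors stop. An illustrative Go snippet for listing what, if anything, is present in that directory (an operator-side diagnostic aid, not CRI-O's actual readiness check):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Path taken from the log message; adjust for other distributions.
	dir := "/etc/kubernetes/cni/net.d"
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pat))
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			continue
		}
		for _, m := range matches {
			fmt.Println(m) // any hit here means a CNI config has been written
		}
	}
}
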
pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:36.072140 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:36.072106 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ff55742-6058-4a56-941a-aec639653897" containerID="4c08e6a3870831bbf56fca1c55d3b0f38345e64112e71cd79ba0e381544e9569" exitCode=0 Apr 20 20:11:36.072508 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:36.072187 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" event={"ID":"2ff55742-6058-4a56-941a-aec639653897","Type":"ContainerDied","Data":"4c08e6a3870831bbf56fca1c55d3b0f38345e64112e71cd79ba0e381544e9569"} Apr 20 20:11:36.886460 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:36.886422 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:36.886639 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:36.886422 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:36.886639 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:36.886557 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:36.886774 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:36.886736 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:36.886844 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:36.886806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:36.886902 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:36.886885 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:38.886306 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:38.886094 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:38.886763 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:38.886098 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:38.886763 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:38.886475 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:38.886763 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:38.886382 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:38.886763 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:38.886137 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:38.886763 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:38.886566 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:40.886566 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:40.886523 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:40.887269 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:40.886529 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:40.887269 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:40.886523 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:40.887269 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:40.886684 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2cxr4" podUID="06fca970-a9c2-455e-87f0-219af27b7cfd" Apr 20 20:11:40.887269 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:40.886777 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:11:40.887269 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:40.886838 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g75rf" podUID="c5e48953-ddd6-4f02-b2d7-7a42dede535f" Apr 20 20:11:41.506713 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.506669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:41.506880 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.506810 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:41.506880 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.506877 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret podName:06fca970-a9c2-455e-87f0-219af27b7cfd nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.506859336 +0000 UTC m=+66.185126376 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret") pod "global-pull-secret-syncer-2cxr4" (UID: "06fca970-a9c2-455e-87f0-219af27b7cfd") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:11:41.607783 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.607745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:41.607952 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.607892 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:41.607952 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.607946 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.607932411 +0000 UTC m=+66.286199446 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:11:41.651621 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.651521 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-66.ec2.internal" event="NodeReady" Apr 20 20:11:41.651774 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.651703 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 20:11:41.685112 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.685076 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5f9bf74767-6pq9d"] Apr 20 20:11:41.708943 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.708906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:41.709108 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.709066 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:11:41.709108 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.709085 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:11:41.709108 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.709098 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r2fn7 for pod openshift-network-diagnostics/network-check-target-g75rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:41.709227 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.709148 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7 podName:c5e48953-ddd6-4f02-b2d7-7a42dede535f nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.709134611 +0000 UTC m=+66.387401646 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2fn7" (UniqueName: "kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7") pod "network-check-target-g75rf" (UID: "c5e48953-ddd6-4f02-b2d7-7a42dede535f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:11:41.721746 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.721718 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f9bf74767-6pq9d"] Apr 20 20:11:41.721872 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.721754 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8zpg6"] Apr 20 20:11:41.721936 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.721872 2578 util.go:30] "No sandbox for pod can be found. 
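
Note: the recurring `object "namespace"/"name" not registered` failures come from the kubelet's watch-based secret/configmap manager: a lookup succeeds only after the object has been registered for watching and its reflector cache has synced. The "Caches populated" reflector entries below show that same mechanism completing promptly for the newly added image-registry, dns, and ingress-canary pods. A registration-gated lookup in miniature, as an illustration of the failure mode rather than the kubelet's implementation:

package main

import "fmt"

// gatedStore rejects lookups for keys that were never registered, the shape
// of the "not registered" errors in the log above.
type gatedStore struct {
	registered map[string]bool
	cache      map[string]string
}

func (s *gatedStore) Get(key string) (string, error) {
	if !s.registered[key] {
		return "", fmt.Errorf("object %q not registered", key)
	}
	return s.cache[key], nil
}

func main() {
	s := &gatedStore{registered: map[string]bool{}, cache: map[string]string{}}
	key := "openshift-multus/metrics-daemon-secret"
	if _, err := s.Get(key); err != nil {
		fmt.Println(err) // fails, as in the log, before the watch is registered
	}
	// Once a watch is registered and its cache populated, the next retry succeeds.
	s.registered[key] = true
	s.cache[key] = "<secret bytes>"
	v, _ := s.Get(key)
	fmt.Println(v)
}
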
Apr 20 20:11:41.721936 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.721872 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.724923 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.724468 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 20:11:41.724923 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.724607 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rb8vs\""
Apr 20 20:11:41.724923 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.724712 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 20:11:41.724923 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.724801 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 20:11:41.731401 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.731384 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 20:11:41.735946 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.735927 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jp7bd"]
Apr 20 20:11:41.736075 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.736061 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8zpg6"
Apr 20 20:11:41.738465 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.738446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 20:11:41.738558 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.738477 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 20:11:41.738837 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.738820 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bkrnz\""
Apr 20 20:11:41.749291 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.749274 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8zpg6"]
Apr 20 20:11:41.749375 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.749297 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jp7bd"]
Apr 20 20:11:41.749411 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.749392 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jp7bd"
Apr 20 20:11:41.752441 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.752422 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rfvs6\""
Apr 20 20:11:41.752530 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.752451 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 20:11:41.752530 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.752472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 20:11:41.752734 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.752720 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 20:11:41.809274 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.809243 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4764d884-9129-4d18-82c6-332686e75ae2-ca-trust-extracted\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.809441 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.809286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7w6c\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-kube-api-access-r7w6c\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.809441 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.809346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-image-registry-private-configuration\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.809441 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.809407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-bound-sa-token\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.809634 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.809501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-trusted-ca\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.809634 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.809549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.809634 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.809600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-registry-certificates\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.809753 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.809635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-installation-pull-secrets\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.910318 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910283 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-installation-pull-secrets\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dfdj\" (UniqueName: \"kubernetes.io/projected/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-kube-api-access-4dfdj\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6"
Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4764d884-9129-4d18-82c6-332686e75ae2-ca-trust-extracted\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7w6c\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-kube-api-access-r7w6c\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-image-registry-private-configuration\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d"
Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5mp7\" (UniqueName:
\"kubernetes.io/projected/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-kube-api-access-d5mp7\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910774 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4764d884-9129-4d18-82c6-332686e75ae2-ca-trust-extracted\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-bound-sa-token\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.910897 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-tmp-dir\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:41.911223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:41.911223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.910966 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-trusted-ca\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.911223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.911015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-config-volume\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:41.911223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.911057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.911223 
ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.911090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-registry-certificates\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.911223 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.911213 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:11:41.911223 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.911227 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f9bf74767-6pq9d: secret "image-registry-tls" not found Apr 20 20:11:41.911429 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:41.911284 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls podName:4764d884-9129-4d18-82c6-332686e75ae2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:42.411265784 +0000 UTC m=+35.089532818 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls") pod "image-registry-5f9bf74767-6pq9d" (UID: "4764d884-9129-4d18-82c6-332686e75ae2") : secret "image-registry-tls" not found Apr 20 20:11:41.915287 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.915265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-image-registry-private-configuration\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.915390 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.915289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-installation-pull-secrets\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.919903 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.919878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-registry-certificates\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.920344 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.920304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7w6c\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-kube-api-access-r7w6c\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.920344 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.920304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-bound-sa-token\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:41.920501 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:41.920483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-trusted-ca\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:42.011759 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.011725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-tmp-dir\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:42.011929 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.011769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:42.011929 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.011801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-config-volume\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:42.011929 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.011904 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:42.012098 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.011980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dfdj\" (UniqueName: \"kubernetes.io/projected/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-kube-api-access-4dfdj\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:42.012098 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.011996 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert podName:3e10e1c4-64b7-4da9-be2e-38882fc2fbb7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:42.511975596 +0000 UTC m=+35.190242649 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert") pod "ingress-canary-jp7bd" (UID: "3e10e1c4-64b7-4da9-be2e-38882fc2fbb7") : secret "canary-serving-cert" not found Apr 20 20:11:42.012098 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.012076 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5mp7\" (UniqueName: \"kubernetes.io/projected/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-kube-api-access-d5mp7\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:42.012098 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.012080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-tmp-dir\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:42.012291 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.012114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:42.012291 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.012241 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:42.012291 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.012280 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls podName:14b4f842-f248-42a7-8dcd-51dcb4e2ab36 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:42.512267572 +0000 UTC m=+35.190534611 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls") pod "dns-default-8zpg6" (UID: "14b4f842-f248-42a7-8dcd-51dcb4e2ab36") : secret "dns-default-metrics-tls" not found Apr 20 20:11:42.012429 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.012313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-config-volume\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:42.023463 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.023435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dfdj\" (UniqueName: \"kubernetes.io/projected/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-kube-api-access-4dfdj\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:42.028596 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.028563 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5mp7\" (UniqueName: \"kubernetes.io/projected/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-kube-api-access-d5mp7\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:42.414963 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.414927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:42.415167 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.415087 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:11:42.415167 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.415105 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f9bf74767-6pq9d: secret "image-registry-tls" not found Apr 20 20:11:42.415167 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.415159 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls podName:4764d884-9129-4d18-82c6-332686e75ae2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:43.415140964 +0000 UTC m=+36.093408020 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls") pod "image-registry-5f9bf74767-6pq9d" (UID: "4764d884-9129-4d18-82c6-332686e75ae2") : secret "image-registry-tls" not found Apr 20 20:11:42.516017 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.515978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:42.516147 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.516034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:42.516147 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.516141 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:42.516226 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.516142 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:42.516226 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.516196 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert podName:3e10e1c4-64b7-4da9-be2e-38882fc2fbb7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:43.516180703 +0000 UTC m=+36.194447739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert") pod "ingress-canary-jp7bd" (UID: "3e10e1c4-64b7-4da9-be2e-38882fc2fbb7") : secret "canary-serving-cert" not found Apr 20 20:11:42.516226 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:42.516220 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls podName:14b4f842-f248-42a7-8dcd-51dcb4e2ab36 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:43.516201748 +0000 UTC m=+36.194468782 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls") pod "dns-default-8zpg6" (UID: "14b4f842-f248-42a7-8dcd-51dcb4e2ab36") : secret "dns-default-metrics-tls" not found Apr 20 20:11:42.886357 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.886268 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:11:42.886529 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.886386 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:11:42.886529 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.886401 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:11:42.889317 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.889299 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:11:42.889511 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.889495 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-698bz\"" Apr 20 20:11:42.889592 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.889506 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:11:42.889592 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.889509 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:11:42.890325 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.890310 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-779fg\"" Apr 20 20:11:42.890398 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:42.890310 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:11:43.087475 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:43.087441 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ff55742-6058-4a56-941a-aec639653897" containerID="48d5203a406a677a052b9e7748a9c3a33fc470147902f3753bc5ded9be2bdb4e" exitCode=0 Apr 20 20:11:43.087998 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:43.087485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" event={"ID":"2ff55742-6058-4a56-941a-aec639653897","Type":"ContainerDied","Data":"48d5203a406a677a052b9e7748a9c3a33fc470147902f3753bc5ded9be2bdb4e"} Apr 20 20:11:43.424751 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:43.424713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:43.424897 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:43.424860 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:11:43.424897 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:43.424874 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f9bf74767-6pq9d: secret "image-registry-tls" not found Apr 20 20:11:43.424961 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:43.424927 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls podName:4764d884-9129-4d18-82c6-332686e75ae2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:45.424911051 +0000 UTC m=+38.103178107 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls") pod "image-registry-5f9bf74767-6pq9d" (UID: "4764d884-9129-4d18-82c6-332686e75ae2") : secret "image-registry-tls" not found Apr 20 20:11:43.525917 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:43.525839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:43.525917 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:43.525903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:43.526051 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:43.525993 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:43.526081 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:43.526064 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls podName:14b4f842-f248-42a7-8dcd-51dcb4e2ab36 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:45.526049485 +0000 UTC m=+38.204316523 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls") pod "dns-default-8zpg6" (UID: "14b4f842-f248-42a7-8dcd-51dcb4e2ab36") : secret "dns-default-metrics-tls" not found Apr 20 20:11:43.526120 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:43.525996 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:43.526151 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:43.526144 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert podName:3e10e1c4-64b7-4da9-be2e-38882fc2fbb7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:45.526129877 +0000 UTC m=+38.204396914 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert") pod "ingress-canary-jp7bd" (UID: "3e10e1c4-64b7-4da9-be2e-38882fc2fbb7") : secret "canary-serving-cert" not found Apr 20 20:11:44.093003 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:44.092962 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ff55742-6058-4a56-941a-aec639653897" containerID="d41b479c8c7385531ea50b910604b4ddc6c2a46ce2d63c3a7a924acceb5b7716" exitCode=0 Apr 20 20:11:44.093447 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:44.093032 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" event={"ID":"2ff55742-6058-4a56-941a-aec639653897","Type":"ContainerDied","Data":"d41b479c8c7385531ea50b910604b4ddc6c2a46ce2d63c3a7a924acceb5b7716"} Apr 20 20:11:45.098172 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:45.097992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" event={"ID":"2ff55742-6058-4a56-941a-aec639653897","Type":"ContainerStarted","Data":"ae93ef485ec13646ea7ccac3e6f7c93289874e3a0a88b7d53b453916aa6e9895"} Apr 20 20:11:45.119698 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:45.119652 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qdxkm" podStartSLOduration=5.760966363 podStartE2EDuration="37.11963644s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:11:10.851343869 +0000 UTC m=+3.529610903" lastFinishedPulling="2026-04-20 20:11:42.210013938 +0000 UTC m=+34.888280980" observedRunningTime="2026-04-20 20:11:45.118173387 +0000 UTC m=+37.796440444" watchObservedRunningTime="2026-04-20 20:11:45.11963644 +0000 UTC m=+37.797903496" Apr 20 20:11:45.440183 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:45.440143 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:45.440342 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:45.440263 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:11:45.440342 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:45.440282 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f9bf74767-6pq9d: secret "image-registry-tls" not found Apr 20 20:11:45.440342 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:45.440334 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls podName:4764d884-9129-4d18-82c6-332686e75ae2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:49.440318745 +0000 UTC m=+42.118585780 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls") pod "image-registry-5f9bf74767-6pq9d" (UID: "4764d884-9129-4d18-82c6-332686e75ae2") : secret "image-registry-tls" not found Apr 20 20:11:45.541054 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:45.541014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:45.541227 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:45.541104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:45.541227 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:45.541188 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:45.541227 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:45.541198 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:45.541375 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:45.541251 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls podName:14b4f842-f248-42a7-8dcd-51dcb4e2ab36 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:49.541238687 +0000 UTC m=+42.219505724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls") pod "dns-default-8zpg6" (UID: "14b4f842-f248-42a7-8dcd-51dcb4e2ab36") : secret "dns-default-metrics-tls" not found Apr 20 20:11:45.541375 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:45.541263 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert podName:3e10e1c4-64b7-4da9-be2e-38882fc2fbb7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:49.541257835 +0000 UTC m=+42.219524870 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert") pod "ingress-canary-jp7bd" (UID: "3e10e1c4-64b7-4da9-be2e-38882fc2fbb7") : secret "canary-serving-cert" not found Apr 20 20:11:49.468708 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:49.468650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:49.469097 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:49.468797 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:11:49.469097 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:49.468816 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f9bf74767-6pq9d: secret "image-registry-tls" not found Apr 20 20:11:49.469097 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:49.468871 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls podName:4764d884-9129-4d18-82c6-332686e75ae2 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:57.468855533 +0000 UTC m=+50.147122572 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls") pod "image-registry-5f9bf74767-6pq9d" (UID: "4764d884-9129-4d18-82c6-332686e75ae2") : secret "image-registry-tls" not found Apr 20 20:11:49.569227 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:49.569175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:49.569346 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:49.569249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:49.569425 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:49.569350 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:49.569425 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:49.569368 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:49.569425 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:49.569411 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert podName:3e10e1c4-64b7-4da9-be2e-38882fc2fbb7 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:57.569397394 +0000 UTC m=+50.247664429 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert") pod "ingress-canary-jp7bd" (UID: "3e10e1c4-64b7-4da9-be2e-38882fc2fbb7") : secret "canary-serving-cert" not found Apr 20 20:11:49.569537 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:49.569448 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls podName:14b4f842-f248-42a7-8dcd-51dcb4e2ab36 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:57.569426761 +0000 UTC m=+50.247693806 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls") pod "dns-default-8zpg6" (UID: "14b4f842-f248-42a7-8dcd-51dcb4e2ab36") : secret "dns-default-metrics-tls" not found Apr 20 20:11:57.524998 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:57.524945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:11:57.525511 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:57.525105 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:11:57.525511 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:57.525125 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f9bf74767-6pq9d: secret "image-registry-tls" not found Apr 20 20:11:57.525511 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:57.525182 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls podName:4764d884-9129-4d18-82c6-332686e75ae2 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.525165938 +0000 UTC m=+66.203432973 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls") pod "image-registry-5f9bf74767-6pq9d" (UID: "4764d884-9129-4d18-82c6-332686e75ae2") : secret "image-registry-tls" not found Apr 20 20:11:57.626125 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:57.626091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:11:57.626269 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:11:57.626153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:11:57.626307 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:57.626265 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:11:57.626339 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:57.626303 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:11:57.626339 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:57.626326 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls podName:14b4f842-f248-42a7-8dcd-51dcb4e2ab36 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.626308468 +0000 UTC m=+66.304575526 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls") pod "dns-default-8zpg6" (UID: "14b4f842-f248-42a7-8dcd-51dcb4e2ab36") : secret "dns-default-metrics-tls" not found Apr 20 20:11:57.626412 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:11:57.626352 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert podName:3e10e1c4-64b7-4da9-be2e-38882fc2fbb7 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:13.626339755 +0000 UTC m=+66.304606790 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert") pod "ingress-canary-jp7bd" (UID: "3e10e1c4-64b7-4da9-be2e-38882fc2fbb7") : secret "canary-serving-cert" not found Apr 20 20:12:07.085678 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:07.085643 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zg9wv" Apr 20 20:12:13.534793 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.534752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:12:13.535219 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.534816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:12:13.535219 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.534932 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:12:13.535219 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.534944 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f9bf74767-6pq9d: secret "image-registry-tls" not found Apr 20 20:12:13.535219 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.535021 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls podName:4764d884-9129-4d18-82c6-332686e75ae2 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:45.535002167 +0000 UTC m=+98.213269219 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls") pod "image-registry-5f9bf74767-6pq9d" (UID: "4764d884-9129-4d18-82c6-332686e75ae2") : secret "image-registry-tls" not found Apr 20 20:12:13.538020 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.537998 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:12:13.549413 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.549387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/06fca970-a9c2-455e-87f0-219af27b7cfd-original-pull-secret\") pod \"global-pull-secret-syncer-2cxr4\" (UID: \"06fca970-a9c2-455e-87f0-219af27b7cfd\") " pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:12:13.635780 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.635755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:12:13.635876 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.635816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:12:13.635876 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.635834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:12:13.635976 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.635885 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:12:13.635976 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.635948 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert podName:3e10e1c4-64b7-4da9-be2e-38882fc2fbb7 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:45.635932423 +0000 UTC m=+98.314199458 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert") pod "ingress-canary-jp7bd" (UID: "3e10e1c4-64b7-4da9-be2e-38882fc2fbb7") : secret "canary-serving-cert" not found Apr 20 20:12:13.636080 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.636008 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:12:13.636133 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.636087 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls podName:14b4f842-f248-42a7-8dcd-51dcb4e2ab36 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:45.636067947 +0000 UTC m=+98.314335001 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls") pod "dns-default-8zpg6" (UID: "14b4f842-f248-42a7-8dcd-51dcb4e2ab36") : secret "dns-default-metrics-tls" not found Apr 20 20:12:13.638059 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.638042 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:12:13.646616 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.646601 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:12:13.646666 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:13.646654 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:13:17.646641729 +0000 UTC m=+130.324908770 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : secret "metrics-daemon-secret" not found Apr 20 20:12:13.736305 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.736282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:12:13.739379 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.739363 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:12:13.748990 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.748971 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:12:13.759550 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.759525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fn7\" (UniqueName: \"kubernetes.io/projected/c5e48953-ddd6-4f02-b2d7-7a42dede535f-kube-api-access-r2fn7\") pod \"network-check-target-g75rf\" (UID: \"c5e48953-ddd6-4f02-b2d7-7a42dede535f\") " pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:12:13.801547 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.801498 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2cxr4" Apr 20 20:12:13.808565 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.808545 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-698bz\"" Apr 20 20:12:13.816714 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.816694 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:12:13.963831 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.963803 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2cxr4"] Apr 20 20:12:13.966881 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:12:13.966857 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06fca970_a9c2_455e_87f0_219af27b7cfd.slice/crio-ea041a6a1dee7ee631bf2c702ae29db7d0f727482048d815f3e58f1a7bb77eae WatchSource:0}: Error finding container ea041a6a1dee7ee631bf2c702ae29db7d0f727482048d815f3e58f1a7bb77eae: Status 404 returned error can't find the container with id ea041a6a1dee7ee631bf2c702ae29db7d0f727482048d815f3e58f1a7bb77eae Apr 20 20:12:13.978011 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:13.977988 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-g75rf"] Apr 20 20:12:13.980908 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:12:13.980886 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e48953_ddd6_4f02_b2d7_7a42dede535f.slice/crio-956a7ed66c5a55bb82a40d9f1ff880cfeaae9cffd92fa21948caa7ea0d8d6c30 WatchSource:0}: Error finding container 956a7ed66c5a55bb82a40d9f1ff880cfeaae9cffd92fa21948caa7ea0d8d6c30: Status 404 returned error can't find the container with id 956a7ed66c5a55bb82a40d9f1ff880cfeaae9cffd92fa21948caa7ea0d8d6c30 Apr 20 20:12:14.152517 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:14.152463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-g75rf" event={"ID":"c5e48953-ddd6-4f02-b2d7-7a42dede535f","Type":"ContainerStarted","Data":"956a7ed66c5a55bb82a40d9f1ff880cfeaae9cffd92fa21948caa7ea0d8d6c30"} Apr 20 20:12:14.153418 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:14.153399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2cxr4" event={"ID":"06fca970-a9c2-455e-87f0-219af27b7cfd","Type":"ContainerStarted","Data":"ea041a6a1dee7ee631bf2c702ae29db7d0f727482048d815f3e58f1a7bb77eae"} Apr 20 20:12:19.165948 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:19.165905 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-g75rf" event={"ID":"c5e48953-ddd6-4f02-b2d7-7a42dede535f","Type":"ContainerStarted","Data":"3e0d32dcf8058218cfcdab8c9797f029e6ff4bfc801649c84085c3e65e4505d8"} Apr 20 20:12:19.166451 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:19.166052 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:12:19.167413 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:19.167387 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2cxr4" event={"ID":"06fca970-a9c2-455e-87f0-219af27b7cfd","Type":"ContainerStarted","Data":"3ef1715289967ae8a54cde5cadef335dd533ff33c54c3f961f7be96ab9956c10"} Apr 20 20:12:19.180802 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:19.180752 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-g75rf" podStartSLOduration=66.922030942 podStartE2EDuration="1m11.180739144s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" 
firstStartedPulling="2026-04-20 20:12:13.982620059 +0000 UTC m=+66.660887094" lastFinishedPulling="2026-04-20 20:12:18.241328258 +0000 UTC m=+70.919595296" observedRunningTime="2026-04-20 20:12:19.180284586 +0000 UTC m=+71.858551637" watchObservedRunningTime="2026-04-20 20:12:19.180739144 +0000 UTC m=+71.859006201" Apr 20 20:12:19.193687 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:19.193646 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2cxr4" podStartSLOduration=66.915489702 podStartE2EDuration="1m11.193634069s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:12:13.968609954 +0000 UTC m=+66.646876998" lastFinishedPulling="2026-04-20 20:12:18.24675433 +0000 UTC m=+70.925021365" observedRunningTime="2026-04-20 20:12:19.192696524 +0000 UTC m=+71.870963581" watchObservedRunningTime="2026-04-20 20:12:19.193634069 +0000 UTC m=+71.871901126" Apr 20 20:12:28.773536 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.773413 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn"] Apr 20 20:12:28.776154 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.776137 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.778368 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.778339 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 20:12:28.778600 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.778565 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 20:12:28.779569 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.779551 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 20:12:28.779663 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.779626 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 20:12:28.782738 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.782350 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn"] Apr 20 20:12:28.839370 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.839346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44b156d7-2185-4ab8-b1df-387f43bb23bf-tmp\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.839476 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.839398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/44b156d7-2185-4ab8-b1df-387f43bb23bf-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.839540 ip-10-0-134-66 
kubenswrapper[2578]: I0420 20:12:28.839465 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jpfh\" (UniqueName: \"kubernetes.io/projected/44b156d7-2185-4ab8-b1df-387f43bb23bf-kube-api-access-2jpfh\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.940229 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.940196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44b156d7-2185-4ab8-b1df-387f43bb23bf-tmp\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.940335 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.940256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/44b156d7-2185-4ab8-b1df-387f43bb23bf-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.940396 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.940356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jpfh\" (UniqueName: \"kubernetes.io/projected/44b156d7-2185-4ab8-b1df-387f43bb23bf-kube-api-access-2jpfh\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.940629 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.940609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44b156d7-2185-4ab8-b1df-387f43bb23bf-tmp\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.943645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.943628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/44b156d7-2185-4ab8-b1df-387f43bb23bf-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:28.949172 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:28.949152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jpfh\" (UniqueName: \"kubernetes.io/projected/44b156d7-2185-4ab8-b1df-387f43bb23bf-kube-api-access-2jpfh\") pod \"klusterlet-addon-workmgr-6fbc579dfc-z2nsn\" (UID: \"44b156d7-2185-4ab8-b1df-387f43bb23bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:29.085338 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:29.085276 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:29.197504 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:29.197478 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn"] Apr 20 20:12:29.200659 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:12:29.200627 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b156d7_2185_4ab8_b1df_387f43bb23bf.slice/crio-839fd071767c6b9dd2861f13729999d7e61e38f80ce951c3ae1cf45e30c5c078 WatchSource:0}: Error finding container 839fd071767c6b9dd2861f13729999d7e61e38f80ce951c3ae1cf45e30c5c078: Status 404 returned error can't find the container with id 839fd071767c6b9dd2861f13729999d7e61e38f80ce951c3ae1cf45e30c5c078 Apr 20 20:12:30.190000 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:30.189958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" event={"ID":"44b156d7-2185-4ab8-b1df-387f43bb23bf","Type":"ContainerStarted","Data":"839fd071767c6b9dd2861f13729999d7e61e38f80ce951c3ae1cf45e30c5c078"} Apr 20 20:12:34.201083 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:34.201047 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" event={"ID":"44b156d7-2185-4ab8-b1df-387f43bb23bf","Type":"ContainerStarted","Data":"f3cdd1212e2bfa2dd8523be8cc0907dc958555976e1c9b380f033a9ea77a1537"} Apr 20 20:12:34.201430 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:34.201250 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:34.202926 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:34.202905 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" Apr 20 20:12:34.215161 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:34.215114 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fbc579dfc-z2nsn" podStartSLOduration=2.089895948 podStartE2EDuration="6.215103692s" podCreationTimestamp="2026-04-20 20:12:28 +0000 UTC" firstStartedPulling="2026-04-20 20:12:29.202148541 +0000 UTC m=+81.880415590" lastFinishedPulling="2026-04-20 20:12:33.327356294 +0000 UTC m=+86.005623334" observedRunningTime="2026-04-20 20:12:34.214733904 +0000 UTC m=+86.893000961" watchObservedRunningTime="2026-04-20 20:12:34.215103692 +0000 UTC m=+86.893370748" Apr 20 20:12:45.561771 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:45.561736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:12:45.562192 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:45.561904 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:12:45.562192 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:45.561926 2578 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-5f9bf74767-6pq9d: secret "image-registry-tls" not found Apr 20 20:12:45.562192 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:45.561998 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls podName:4764d884-9129-4d18-82c6-332686e75ae2 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:49.561981583 +0000 UTC m=+162.240248618 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls") pod "image-registry-5f9bf74767-6pq9d" (UID: "4764d884-9129-4d18-82c6-332686e75ae2") : secret "image-registry-tls" not found Apr 20 20:12:45.662931 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:45.662903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:12:45.663023 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:45.662946 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:12:45.663061 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:45.663035 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:12:45.663094 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:45.663055 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:12:45.663094 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:45.663080 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert podName:3e10e1c4-64b7-4da9-be2e-38882fc2fbb7 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:49.663068234 +0000 UTC m=+162.341335269 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert") pod "ingress-canary-jp7bd" (UID: "3e10e1c4-64b7-4da9-be2e-38882fc2fbb7") : secret "canary-serving-cert" not found Apr 20 20:12:45.663163 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:12:45.663118 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls podName:14b4f842-f248-42a7-8dcd-51dcb4e2ab36 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:49.663104501 +0000 UTC m=+162.341371541 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls") pod "dns-default-8zpg6" (UID: "14b4f842-f248-42a7-8dcd-51dcb4e2ab36") : secret "dns-default-metrics-tls" not found Apr 20 20:12:50.172028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:50.171994 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-g75rf" Apr 20 20:12:57.460032 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.459995 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8hmpw"] Apr 20 20:12:57.462808 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.462790 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.465426 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.465401 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 20:12:57.465426 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.465416 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:12:57.465623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.465401 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:12:57.465623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.465445 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 20:12:57.466686 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.466664 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gjgnw\"" Apr 20 20:12:57.471784 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.471761 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8hmpw"] Apr 20 20:12:57.471972 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.471955 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 20:12:57.541019 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.540996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a9def-653b-4320-b877-97fe51dc9677-service-ca-bundle\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.541145 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.541045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a9def-653b-4320-b877-97fe51dc9677-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.541145 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.541079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqqqp\" (UniqueName: 
\"kubernetes.io/projected/210a9def-653b-4320-b877-97fe51dc9677-kube-api-access-hqqqp\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.541145 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.541115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/210a9def-653b-4320-b877-97fe51dc9677-tmp\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.541145 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.541139 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/210a9def-653b-4320-b877-97fe51dc9677-snapshots\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.541287 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.541217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210a9def-653b-4320-b877-97fe51dc9677-serving-cert\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.641450 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.641424 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a9def-653b-4320-b877-97fe51dc9677-service-ca-bundle\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.641540 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.641462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a9def-653b-4320-b877-97fe51dc9677-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.641540 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.641484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqqqp\" (UniqueName: \"kubernetes.io/projected/210a9def-653b-4320-b877-97fe51dc9677-kube-api-access-hqqqp\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.641540 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.641520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/210a9def-653b-4320-b877-97fe51dc9677-tmp\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.641540 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.641535 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/210a9def-653b-4320-b877-97fe51dc9677-snapshots\") pod 
\"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.641761 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.641594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210a9def-653b-4320-b877-97fe51dc9677-serving-cert\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.641963 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.641934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/210a9def-653b-4320-b877-97fe51dc9677-tmp\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.642212 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.642172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a9def-653b-4320-b877-97fe51dc9677-service-ca-bundle\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.642340 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.642309 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/210a9def-653b-4320-b877-97fe51dc9677-snapshots\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.642457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.642433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a9def-653b-4320-b877-97fe51dc9677-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.644092 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.644072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210a9def-653b-4320-b877-97fe51dc9677-serving-cert\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.650029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.650007 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqqqp\" (UniqueName: \"kubernetes.io/projected/210a9def-653b-4320-b877-97fe51dc9677-kube-api-access-hqqqp\") pod \"insights-operator-585dfdc468-8hmpw\" (UID: \"210a9def-653b-4320-b877-97fe51dc9677\") " pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.773438 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.773380 2578 util.go:30] "No sandbox for pod can be found. 
Apr 20 20:12:57.773438 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.773380 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8hmpw" Apr 20 20:12:57.889507 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:57.889485 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8hmpw"] Apr 20 20:12:57.891166 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:12:57.891140 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod210a9def_653b_4320_b877_97fe51dc9677.slice/crio-515eb310ace08584ed73d642ca5286b9aeb2857804bfba65c5509c4acf79b824 WatchSource:0}: Error finding container 515eb310ace08584ed73d642ca5286b9aeb2857804bfba65c5509c4acf79b824: Status 404 returned error can't find the container with id 515eb310ace08584ed73d642ca5286b9aeb2857804bfba65c5509c4acf79b824 Apr 20 20:12:58.246271 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:12:58.246231 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8hmpw" event={"ID":"210a9def-653b-4320-b877-97fe51dc9677","Type":"ContainerStarted","Data":"515eb310ace08584ed73d642ca5286b9aeb2857804bfba65c5509c4acf79b824"} Apr 20 20:13:00.251262 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:00.251228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8hmpw" event={"ID":"210a9def-653b-4320-b877-97fe51dc9677","Type":"ContainerStarted","Data":"2c9b0330db68ca4d842c8b079f7b603a2f5d3198be971e41ac35b58e7bec9988"} Apr 20 20:13:00.271031 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:00.270989 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-8hmpw" podStartSLOduration=1.324566092 podStartE2EDuration="3.270976291s" podCreationTimestamp="2026-04-20 20:12:57 +0000 UTC" firstStartedPulling="2026-04-20 20:12:57.892872279 +0000 UTC m=+110.571139315" lastFinishedPulling="2026-04-20 20:12:59.839282474 +0000 UTC m=+112.517549514" observedRunningTime="2026-04-20 20:13:00.270624038 +0000 UTC m=+112.948891096" watchObservedRunningTime="2026-04-20 20:13:00.270976291 +0000 UTC m=+112.949243347"
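
Note: the "Observed pod startup duration" entry above carries two numbers. podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same interval with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted, which is why it is always the smaller of the two. A quick recomputation for the insights-operator pod, with the timestamps copied from the entry (wall-clock values, so the last digits can differ slightly from the logged monotonic arithmetic):

package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Timestamps copied from the pod_startup_latency_tracker entry above.
	created := parse("2026-04-20 20:12:57 +0000 UTC")
	firstPull := parse("2026-04-20 20:12:57.892872279 +0000 UTC")
	lastPull := parse("2026-04-20 20:12:59.839282474 +0000 UTC")
	running := parse("2026-04-20 20:13:00.270976291 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration: 3.270976291s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: ~1.324566s, pull time excluded
	fmt.Println(e2e, slo)
}
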
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:07.483120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.483097 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 20:13:07.483120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.483114 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 20:13:07.483299 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.483117 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:13:07.483299 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.483197 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-74gdp\"" Apr 20 20:13:07.490451 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.490428 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l"] Apr 20 20:13:07.606159 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.606130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:07.606280 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.606161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hjhd\" (UniqueName: \"kubernetes.io/projected/fed047fd-50e2-4302-ad67-06a2bb9f89c7-kube-api-access-8hjhd\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:07.706917 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.706889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:07.707019 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.706922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hjhd\" (UniqueName: \"kubernetes.io/projected/fed047fd-50e2-4302-ad67-06a2bb9f89c7-kube-api-access-8hjhd\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:07.707076 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:07.707031 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:13:07.707130 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:07.707093 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls podName:fed047fd-50e2-4302-ad67-06a2bb9f89c7 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:08.207080054 +0000 UTC m=+120.885347095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-klc5l" (UID: "fed047fd-50e2-4302-ad67-06a2bb9f89c7") : secret "samples-operator-tls" not found Apr 20 20:13:07.715653 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:07.715627 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hjhd\" (UniqueName: \"kubernetes.io/projected/fed047fd-50e2-4302-ad67-06a2bb9f89c7-kube-api-access-8hjhd\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:08.210190 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:08.210149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:08.210343 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:08.210258 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:13:08.210343 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:08.210323 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls podName:fed047fd-50e2-4302-ad67-06a2bb9f89c7 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:09.210306439 +0000 UTC m=+121.888573474 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-klc5l" (UID: "fed047fd-50e2-4302-ad67-06a2bb9f89c7") : secret "samples-operator-tls" not found Apr 20 20:13:09.217653 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:09.217615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:09.218127 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:09.217750 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:13:09.218127 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:09.217829 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls podName:fed047fd-50e2-4302-ad67-06a2bb9f89c7 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:11.217807268 +0000 UTC m=+123.896074309 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-klc5l" (UID: "fed047fd-50e2-4302-ad67-06a2bb9f89c7") : secret "samples-operator-tls" not found Apr 20 20:13:11.230941 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:11.230902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:11.231353 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:11.231044 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:13:11.231353 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:11.231113 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls podName:fed047fd-50e2-4302-ad67-06a2bb9f89c7 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:15.231097134 +0000 UTC m=+127.909364169 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-klc5l" (UID: "fed047fd-50e2-4302-ad67-06a2bb9f89c7") : secret "samples-operator-tls" not found
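
Note: the durationBeforeRetry values for the failing samples-operator-tls mount double on every attempt: 500ms, 1s, 2s, 4s above, then 8s further down, while mounts that keep failing settle at the 1m4s and 2m2s delays seen elsewhere in this log (registry-tls, metrics-certs). A minimal sketch of that retry schedule, with the initial delay, doubling factor, and ceiling inferred from these entries rather than taken from kubelet's source:

package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initial  = 500 * time.Millisecond        // first durationBeforeRetry in the log
		maxDelay = 2*time.Minute + 2*time.Second // ceiling logged as "2m2s"
	)
	d := initial
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %2d: durationBeforeRetry %v\n", attempt, d)
		d *= 2 // delay doubles after every failure
		if d > maxDelay {
			d = maxDelay
		}
	}
}

Printed out, this reproduces the 500ms through 8s delays logged for this volume and the 1m4s and 2m2s plateaus of the longer-failing mounts.
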
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" Apr 20 20:13:13.275092 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.275069 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 20:13:13.276049 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.276030 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-tz5hj\"" Apr 20 20:13:13.276129 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.276031 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 20:13:13.281741 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.281719 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg"] Apr 20 20:13:13.348246 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.348221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp5sj\" (UniqueName: \"kubernetes.io/projected/27f6d5e5-7798-499d-a755-3e7bfa905be0-kube-api-access-bp5sj\") pod \"migrator-74bb7799d9-flvcg\" (UID: \"27f6d5e5-7798-499d-a755-3e7bfa905be0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" Apr 20 20:13:13.449269 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.449237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp5sj\" (UniqueName: \"kubernetes.io/projected/27f6d5e5-7798-499d-a755-3e7bfa905be0-kube-api-access-bp5sj\") pod \"migrator-74bb7799d9-flvcg\" (UID: \"27f6d5e5-7798-499d-a755-3e7bfa905be0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" Apr 20 20:13:13.459611 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.459591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp5sj\" (UniqueName: \"kubernetes.io/projected/27f6d5e5-7798-499d-a755-3e7bfa905be0-kube-api-access-bp5sj\") pod \"migrator-74bb7799d9-flvcg\" (UID: \"27f6d5e5-7798-499d-a755-3e7bfa905be0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" Apr 20 20:13:13.580452 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.580406 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" Apr 20 20:13:13.690550 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:13.690514 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg"] Apr 20 20:13:13.693426 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:13.693402 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27f6d5e5_7798_499d_a755_3e7bfa905be0.slice/crio-62f24343124b14f114d8a4b8ae212e6b93a00cfa79f2c65fd35349115e83bed8 WatchSource:0}: Error finding container 62f24343124b14f114d8a4b8ae212e6b93a00cfa79f2c65fd35349115e83bed8: Status 404 returned error can't find the container with id 62f24343124b14f114d8a4b8ae212e6b93a00cfa79f2c65fd35349115e83bed8 Apr 20 20:13:14.274948 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.274910 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" event={"ID":"27f6d5e5-7798-499d-a755-3e7bfa905be0","Type":"ContainerStarted","Data":"62f24343124b14f114d8a4b8ae212e6b93a00cfa79f2c65fd35349115e83bed8"} Apr 20 20:13:14.663146 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.663120 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rpr4q"] Apr 20 20:13:14.665133 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.665111 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.667561 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.667538 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:13:14.667765 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.667545 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:13:14.667765 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.667545 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-stvsn\"" Apr 20 20:13:14.676950 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.676927 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rpr4q"] Apr 20 20:13:14.758722 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.758696 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkf2\" (UniqueName: \"kubernetes.io/projected/84d2f547-e0e6-4189-b17f-bfe907634d26-kube-api-access-dwkf2\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.758828 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.758746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/84d2f547-e0e6-4189-b17f-bfe907634d26-crio-socket\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.758828 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.758819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/84d2f547-e0e6-4189-b17f-bfe907634d26-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.758901 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.758848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.758901 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.758884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/84d2f547-e0e6-4189-b17f-bfe907634d26-data-volume\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.860000 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.859974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.860104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.860015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/84d2f547-e0e6-4189-b17f-bfe907634d26-data-volume\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.860104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.860056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkf2\" (UniqueName: \"kubernetes.io/projected/84d2f547-e0e6-4189-b17f-bfe907634d26-kube-api-access-dwkf2\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.860104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.860087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/84d2f547-e0e6-4189-b17f-bfe907634d26-crio-socket\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.860242 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.860109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/84d2f547-e0e6-4189-b17f-bfe907634d26-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.860242 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:14.860143 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret 
"insights-runtime-extractor-tls" not found Apr 20 20:13:14.860242 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.860223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/84d2f547-e0e6-4189-b17f-bfe907634d26-crio-socket\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.860356 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:14.860232 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls podName:84d2f547-e0e6-4189-b17f-bfe907634d26 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:15.360209925 +0000 UTC m=+128.038476966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpr4q" (UID: "84d2f547-e0e6-4189-b17f-bfe907634d26") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:14.860397 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.860350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/84d2f547-e0e6-4189-b17f-bfe907634d26-data-volume\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.860515 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.860499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/84d2f547-e0e6-4189-b17f-bfe907634d26-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.868133 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.868110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkf2\" (UniqueName: \"kubernetes.io/projected/84d2f547-e0e6-4189-b17f-bfe907634d26-kube-api-access-dwkf2\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:14.948266 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.948217 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zlw4l"] Apr 20 20:13:14.950093 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.950080 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:14.952547 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.952524 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 20:13:14.952547 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.952527 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-cw8c5\"" Apr 20 20:13:14.952753 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.952547 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 20:13:14.952753 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.952607 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 20:13:14.952753 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.952651 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 20:13:14.957307 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:14.957276 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zlw4l"] Apr 20 20:13:15.061573 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.061543 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6f3f27b1-5436-4973-bbcf-d9635154bc74-signing-cabundle\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.061749 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.061610 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bg77\" (UniqueName: \"kubernetes.io/projected/6f3f27b1-5436-4973-bbcf-d9635154bc74-kube-api-access-2bg77\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.061749 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.061664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6f3f27b1-5436-4973-bbcf-d9635154bc74-signing-key\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.163131 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.163094 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bg77\" (UniqueName: \"kubernetes.io/projected/6f3f27b1-5436-4973-bbcf-d9635154bc74-kube-api-access-2bg77\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.163302 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.163211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6f3f27b1-5436-4973-bbcf-d9635154bc74-signing-key\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.163369 ip-10-0-134-66 kubenswrapper[2578]: I0420 
20:13:15.163308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6f3f27b1-5436-4973-bbcf-d9635154bc74-signing-cabundle\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.164065 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.164033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6f3f27b1-5436-4973-bbcf-d9635154bc74-signing-cabundle\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.166154 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.166128 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6f3f27b1-5436-4973-bbcf-d9635154bc74-signing-key\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.171421 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.171400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bg77\" (UniqueName: \"kubernetes.io/projected/6f3f27b1-5436-4973-bbcf-d9635154bc74-kube-api-access-2bg77\") pod \"service-ca-865cb79987-zlw4l\" (UID: \"6f3f27b1-5436-4973-bbcf-d9635154bc74\") " pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.258933 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.258917 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zlw4l" Apr 20 20:13:15.264393 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.264363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:15.264608 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:15.264506 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:13:15.264608 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:15.264558 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls podName:fed047fd-50e2-4302-ad67-06a2bb9f89c7 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:23.264542685 +0000 UTC m=+135.942809728 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-klc5l" (UID: "fed047fd-50e2-4302-ad67-06a2bb9f89c7") : secret "samples-operator-tls" not found Apr 20 20:13:15.278927 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.278902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" event={"ID":"27f6d5e5-7798-499d-a755-3e7bfa905be0","Type":"ContainerStarted","Data":"3aa4979e8aa35cb3f907eac27c2f0a53fc1f346a89c3c1fd5d9b9dc7c3c98044"} Apr 20 20:13:15.365348 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.365319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:15.365540 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:15.365464 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:15.365540 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:15.365531 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls podName:84d2f547-e0e6-4189-b17f-bfe907634d26 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:16.365511312 +0000 UTC m=+129.043778347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpr4q" (UID: "84d2f547-e0e6-4189-b17f-bfe907634d26") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:15.377103 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:15.377085 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zlw4l"] Apr 20 20:13:15.379314 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:15.379287 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3f27b1_5436_4973_bbcf_d9635154bc74.slice/crio-9e58ab91c6de6dfa50bf15900448ca2803a1c8ed1e945865eb58ec4db93ffc98 WatchSource:0}: Error finding container 9e58ab91c6de6dfa50bf15900448ca2803a1c8ed1e945865eb58ec4db93ffc98: Status 404 returned error can't find the container with id 9e58ab91c6de6dfa50bf15900448ca2803a1c8ed1e945865eb58ec4db93ffc98 Apr 20 20:13:16.282680 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:16.282647 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zlw4l" event={"ID":"6f3f27b1-5436-4973-bbcf-d9635154bc74","Type":"ContainerStarted","Data":"9e58ab91c6de6dfa50bf15900448ca2803a1c8ed1e945865eb58ec4db93ffc98"} Apr 20 20:13:16.284191 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:16.284163 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" event={"ID":"27f6d5e5-7798-499d-a755-3e7bfa905be0","Type":"ContainerStarted","Data":"ffde6a4f668061b9c3a883bc55eca227f0eba9d5e67e615ea48f33327059f43b"} Apr 20 20:13:16.374233 ip-10-0-134-66 
kubenswrapper[2578]: I0420 20:13:16.374205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:16.374391 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:16.374365 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:16.374454 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:16.374434 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls podName:84d2f547-e0e6-4189-b17f-bfe907634d26 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:18.374415266 +0000 UTC m=+131.052682316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpr4q" (UID: "84d2f547-e0e6-4189-b17f-bfe907634d26") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:17.689808 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:17.689733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:13:17.690143 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:17.689885 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:13:17.690143 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:17.689956 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs podName:a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d nodeName:}" failed. No retries permitted until 2026-04-20 20:15:19.689929084 +0000 UTC m=+252.368196121 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs") pod "network-metrics-daemon-zc8kt" (UID: "a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d") : secret "metrics-daemon-secret" not found Apr 20 20:13:17.902394 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:17.902354 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flvcg" podStartSLOduration=3.3890903740000002 podStartE2EDuration="4.902340009s" podCreationTimestamp="2026-04-20 20:13:13 +0000 UTC" firstStartedPulling="2026-04-20 20:13:13.695228611 +0000 UTC m=+126.373495645" lastFinishedPulling="2026-04-20 20:13:15.208478242 +0000 UTC m=+127.886745280" observedRunningTime="2026-04-20 20:13:16.30099671 +0000 UTC m=+128.979263767" watchObservedRunningTime="2026-04-20 20:13:17.902340009 +0000 UTC m=+130.580607049" Apr 20 20:13:18.290565 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:18.290534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zlw4l" event={"ID":"6f3f27b1-5436-4973-bbcf-d9635154bc74","Type":"ContainerStarted","Data":"44dd829ba68efcdb8cc5901f8f2aed9676ee9061e22925d0d257be9bdc74bc2e"} Apr 20 20:13:18.305320 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:18.305286 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-zlw4l" podStartSLOduration=2.270058719 podStartE2EDuration="4.305273382s" podCreationTimestamp="2026-04-20 20:13:14 +0000 UTC" firstStartedPulling="2026-04-20 20:13:15.381146589 +0000 UTC m=+128.059413624" lastFinishedPulling="2026-04-20 20:13:17.416361249 +0000 UTC m=+130.094628287" observedRunningTime="2026-04-20 20:13:18.303677698 +0000 UTC m=+130.981944759" watchObservedRunningTime="2026-04-20 20:13:18.305273382 +0000 UTC m=+130.983540469" Apr 20 20:13:18.394232 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:18.394202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:18.394401 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:18.394382 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:18.394462 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:18.394451 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls podName:84d2f547-e0e6-4189-b17f-bfe907634d26 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:22.394435035 +0000 UTC m=+135.072702090 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpr4q" (UID: "84d2f547-e0e6-4189-b17f-bfe907634d26") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:22.425480 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:22.425451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:22.425914 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:22.425608 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 20:13:22.425914 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:22.425679 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls podName:84d2f547-e0e6-4189-b17f-bfe907634d26 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:30.425659798 +0000 UTC m=+143.103926842 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpr4q" (UID: "84d2f547-e0e6-4189-b17f-bfe907634d26") : secret "insights-runtime-extractor-tls" not found Apr 20 20:13:23.332358 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:23.332330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:23.334662 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:23.334643 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fed047fd-50e2-4302-ad67-06a2bb9f89c7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-klc5l\" (UID: \"fed047fd-50e2-4302-ad67-06a2bb9f89c7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:23.391867 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:23.391843 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-74gdp\"" Apr 20 20:13:23.400133 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:23.400106 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" Apr 20 20:13:23.526301 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:23.526269 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l"] Apr 20 20:13:24.306300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:24.306263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" event={"ID":"fed047fd-50e2-4302-ad67-06a2bb9f89c7","Type":"ContainerStarted","Data":"3794ef59a215ce0a2cf802069e6844a9d9e937963ed0aa7eb0885e20a2efeec9"} Apr 20 20:13:25.311560 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:25.310986 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" event={"ID":"fed047fd-50e2-4302-ad67-06a2bb9f89c7","Type":"ContainerStarted","Data":"bcfada20751e81fbacdbd2fe516cb142cc7af314f1bcd0ba758dd183b07dab10"} Apr 20 20:13:25.311560 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:25.311025 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" event={"ID":"fed047fd-50e2-4302-ad67-06a2bb9f89c7","Type":"ContainerStarted","Data":"469fa464fc787011be5d5b5b7a58cd7639a5a16dc5ac8bcfe575b02a7edab810"} Apr 20 20:13:25.328167 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:25.328125 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-klc5l" podStartSLOduration=16.713603236 podStartE2EDuration="18.328108456s" podCreationTimestamp="2026-04-20 20:13:07 +0000 UTC" firstStartedPulling="2026-04-20 20:13:23.566387824 +0000 UTC m=+136.244654858" lastFinishedPulling="2026-04-20 20:13:25.180893044 +0000 UTC m=+137.859160078" observedRunningTime="2026-04-20 20:13:25.326905496 +0000 UTC m=+138.005172557" watchObservedRunningTime="2026-04-20 20:13:25.328108456 +0000 UTC m=+138.006375511" Apr 20 20:13:30.489253 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:30.489217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:30.492013 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:30.491991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/84d2f547-e0e6-4189-b17f-bfe907634d26-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpr4q\" (UID: \"84d2f547-e0e6-4189-b17f-bfe907634d26\") " pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:30.577627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:30.577602 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rpr4q" Apr 20 20:13:30.697503 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:30.697478 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rpr4q"] Apr 20 20:13:30.700511 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:30.700473 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d2f547_e0e6_4189_b17f_bfe907634d26.slice/crio-06b5fa63849c6c619a47b91fbb785f64927e9840ee1d700595ab06c498f0fe82 WatchSource:0}: Error finding container 06b5fa63849c6c619a47b91fbb785f64927e9840ee1d700595ab06c498f0fe82: Status 404 returned error can't find the container with id 06b5fa63849c6c619a47b91fbb785f64927e9840ee1d700595ab06c498f0fe82 Apr 20 20:13:31.325795 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:31.325762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpr4q" event={"ID":"84d2f547-e0e6-4189-b17f-bfe907634d26","Type":"ContainerStarted","Data":"a1b4304ec93157a46371f9255c4a41145bcdd8cdc852dda8755c07050261b1d9"} Apr 20 20:13:31.325795 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:31.325796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpr4q" event={"ID":"84d2f547-e0e6-4189-b17f-bfe907634d26","Type":"ContainerStarted","Data":"06b5fa63849c6c619a47b91fbb785f64927e9840ee1d700595ab06c498f0fe82"} Apr 20 20:13:32.331732 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:32.331695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpr4q" event={"ID":"84d2f547-e0e6-4189-b17f-bfe907634d26","Type":"ContainerStarted","Data":"2e047f5794acdd5a9972c6263c4124f66e11cf7e116e1f668776b105a6f2e8b8"} Apr 20 20:13:33.335549 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:33.335523 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpr4q" event={"ID":"84d2f547-e0e6-4189-b17f-bfe907634d26","Type":"ContainerStarted","Data":"8d8cd6a79a1eab9d0c54513957e0135e56838b2107f073d4a886fcbfdda54660"} Apr 20 20:13:33.359294 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:33.359220 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rpr4q" podStartSLOduration=17.019132665 podStartE2EDuration="19.359207257s" podCreationTimestamp="2026-04-20 20:13:14 +0000 UTC" firstStartedPulling="2026-04-20 20:13:30.75220297 +0000 UTC m=+143.430470005" lastFinishedPulling="2026-04-20 20:13:33.092277562 +0000 UTC m=+145.770544597" observedRunningTime="2026-04-20 20:13:33.359206949 +0000 UTC m=+146.037474005" watchObservedRunningTime="2026-04-20 20:13:33.359207257 +0000 UTC m=+146.037474311" Apr 20 20:13:42.598955 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.598921 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76b547796f-td5bz"] Apr 20 20:13:42.601125 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.601109 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.603699 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.603654 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 20:13:42.603699 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.603689 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 20:13:42.603859 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.603797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 20:13:42.605168 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.605145 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 20:13:42.605257 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.605180 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 20:13:42.605257 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.605196 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 20:13:42.605257 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.605201 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-m57vs\"" Apr 20 20:13:42.605257 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.605244 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 20:13:42.610411 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.610391 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b547796f-td5bz"] Apr 20 20:13:42.680707 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.680673 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-service-ca\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.680707 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.680712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-oauth-config\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.680870 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.680741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-serving-cert\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.680870 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.680801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jtmc\" (UniqueName: \"kubernetes.io/projected/db0882fe-9e46-41fe-a40f-02b7855f16d1-kube-api-access-8jtmc\") pod \"console-76b547796f-td5bz\" (UID: 
\"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.680870 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.680832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-config\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.680870 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.680869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-oauth-serving-cert\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.781540 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.781500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-serving-cert\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.781685 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.781562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jtmc\" (UniqueName: \"kubernetes.io/projected/db0882fe-9e46-41fe-a40f-02b7855f16d1-kube-api-access-8jtmc\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.781685 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.781622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-config\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.781685 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.781664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-oauth-serving-cert\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.781830 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.781701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-service-ca\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.781830 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.781733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-oauth-config\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.782360 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.782337 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-config\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.782457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.782436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-oauth-serving-cert\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.782496 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.782444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-service-ca\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.784016 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.783994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-serving-cert\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.784016 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.784004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-oauth-config\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.789633 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.789613 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jtmc\" (UniqueName: \"kubernetes.io/projected/db0882fe-9e46-41fe-a40f-02b7855f16d1-kube-api-access-8jtmc\") pod \"console-76b547796f-td5bz\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:42.910561 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:42.910531 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:43.038782 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:43.038758 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b547796f-td5bz"] Apr 20 20:13:43.040957 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:43.040912 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb0882fe_9e46_41fe_a40f_02b7855f16d1.slice/crio-f85cb3480073c69905ea03edab6c6e8267408c43c5c230336c3b620fefd94d03 WatchSource:0}: Error finding container f85cb3480073c69905ea03edab6c6e8267408c43c5c230336c3b620fefd94d03: Status 404 returned error can't find the container with id f85cb3480073c69905ea03edab6c6e8267408c43c5c230336c3b620fefd94d03 Apr 20 20:13:43.361234 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:43.361155 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b547796f-td5bz" event={"ID":"db0882fe-9e46-41fe-a40f-02b7855f16d1","Type":"ContainerStarted","Data":"f85cb3480073c69905ea03edab6c6e8267408c43c5c230336c3b620fefd94d03"} Apr 20 20:13:44.733837 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:44.733792 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" podUID="4764d884-9129-4d18-82c6-332686e75ae2" Apr 20 20:13:44.745938 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:44.745904 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8zpg6" podUID="14b4f842-f248-42a7-8dcd-51dcb4e2ab36" Apr 20 20:13:44.759059 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:44.759023 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jp7bd" podUID="3e10e1c4-64b7-4da9-be2e-38882fc2fbb7" Apr 20 20:13:45.366446 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:45.366414 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8zpg6" Apr 20 20:13:45.895594 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:45.895536 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zc8kt" podUID="a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d" Apr 20 20:13:46.854456 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.854420 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lzw7p"] Apr 20 20:13:46.857339 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.857321 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.860201 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.860042 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:13:46.860201 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.860057 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 20:13:46.860201 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.860077 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 20:13:46.860201 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.860091 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-c8f4q\"" Apr 20 20:13:46.860491 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.860353 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 20:13:46.861197 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.861179 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 20:13:46.861197 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.861195 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:13:46.922011 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.921984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-tls\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.922334 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.922021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dhjt\" (UniqueName: \"kubernetes.io/projected/71b72aed-15d6-4289-8b96-b55802ca0734-kube-api-access-5dhjt\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.922334 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.922042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-wtmp\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.922334 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.922062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.922334 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.922158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-sys\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.922334 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.922237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71b72aed-15d6-4289-8b96-b55802ca0734-metrics-client-ca\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.922334 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.922281 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-textfile\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.922334 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.922309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-root\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:46.922592 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:46.922334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-accelerators-collector-config\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023167 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhjt\" (UniqueName: \"kubernetes.io/projected/71b72aed-15d6-4289-8b96-b55802ca0734-kube-api-access-5dhjt\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023285 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-wtmp\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023285 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023285 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-sys\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " 
pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023435 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-sys\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023435 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023352 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-wtmp\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023435 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71b72aed-15d6-4289-8b96-b55802ca0734-metrics-client-ca\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023609 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023474 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-textfile\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023609 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-root\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023609 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-accelerators-collector-config\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023754 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-tls\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023754 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:47.023740 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:13:47.023853 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/71b72aed-15d6-4289-8b96-b55802ca0734-root\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.023853 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:47.023806 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-tls podName:71b72aed-15d6-4289-8b96-b55802ca0734 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:47.523787432 +0000 UTC m=+160.202054470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-tls") pod "node-exporter-lzw7p" (UID: "71b72aed-15d6-4289-8b96-b55802ca0734") : secret "node-exporter-tls" not found Apr 20 20:13:47.023853 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.023802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-textfile\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.024292 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.024270 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-accelerators-collector-config\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.024375 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.024305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71b72aed-15d6-4289-8b96-b55802ca0734-metrics-client-ca\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.026137 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.026110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.035956 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.035937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhjt\" (UniqueName: \"kubernetes.io/projected/71b72aed-15d6-4289-8b96-b55802ca0734-kube-api-access-5dhjt\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.374900 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.374826 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b547796f-td5bz" event={"ID":"db0882fe-9e46-41fe-a40f-02b7855f16d1","Type":"ContainerStarted","Data":"61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9"} Apr 20 20:13:47.391907 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.391843 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76b547796f-td5bz" podStartSLOduration=1.782901369 podStartE2EDuration="5.391829063s" podCreationTimestamp="2026-04-20 20:13:42 +0000 UTC" firstStartedPulling="2026-04-20 20:13:43.042988941 +0000 UTC m=+155.721255975" lastFinishedPulling="2026-04-20 20:13:46.65191663 +0000 UTC m=+159.330183669" observedRunningTime="2026-04-20 20:13:47.390850681 +0000 UTC 
m=+160.069117749" watchObservedRunningTime="2026-04-20 20:13:47.391829063 +0000 UTC m=+160.070096120" Apr 20 20:13:47.527315 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.527282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-tls\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.529662 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.529642 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/71b72aed-15d6-4289-8b96-b55802ca0734-node-exporter-tls\") pod \"node-exporter-lzw7p\" (UID: \"71b72aed-15d6-4289-8b96-b55802ca0734\") " pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.768008 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:47.767976 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lzw7p" Apr 20 20:13:47.776633 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:47.776609 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b72aed_15d6_4289_8b96_b55802ca0734.slice/crio-1ad60866ffa885f648964c546d8c7f1acb9108f909b1ee2fa511b496f1edbeb8 WatchSource:0}: Error finding container 1ad60866ffa885f648964c546d8c7f1acb9108f909b1ee2fa511b496f1edbeb8: Status 404 returned error can't find the container with id 1ad60866ffa885f648964c546d8c7f1acb9108f909b1ee2fa511b496f1edbeb8 Apr 20 20:13:48.378542 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:48.378513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lzw7p" event={"ID":"71b72aed-15d6-4289-8b96-b55802ca0734","Type":"ContainerStarted","Data":"1ad60866ffa885f648964c546d8c7f1acb9108f909b1ee2fa511b496f1edbeb8"} Apr 20 20:13:49.382312 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.382278 2578 generic.go:358] "Generic (PLEG): container finished" podID="71b72aed-15d6-4289-8b96-b55802ca0734" containerID="446b48d9cb3b0aed8199d7db16455a431441b45117bc10bbcb59980e580bae4e" exitCode=0 Apr 20 20:13:49.382756 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.382342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lzw7p" event={"ID":"71b72aed-15d6-4289-8b96-b55802ca0734","Type":"ContainerDied","Data":"446b48d9cb3b0aed8199d7db16455a431441b45117bc10bbcb59980e580bae4e"} Apr 20 20:13:49.639839 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.639773 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:13:49.642085 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.642064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"image-registry-5f9bf74767-6pq9d\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:13:49.741128 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.741085 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:13:49.741128 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.741137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:13:49.743433 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.743402 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14b4f842-f248-42a7-8dcd-51dcb4e2ab36-metrics-tls\") pod \"dns-default-8zpg6\" (UID: \"14b4f842-f248-42a7-8dcd-51dcb4e2ab36\") " pod="openshift-dns/dns-default-8zpg6" Apr 20 20:13:49.743548 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.743509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e10e1c4-64b7-4da9-be2e-38882fc2fbb7-cert\") pod \"ingress-canary-jp7bd\" (UID: \"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7\") " pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:13:49.803853 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.803830 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-c9795dfbf-smds4"] Apr 20 20:13:49.806218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.806202 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:49.808737 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.808713 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 20:13:49.808737 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.808730 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 20:13:49.808896 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.808756 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6iosk0k86c4ba\"" Apr 20 20:13:49.808896 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.808796 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-htqqs\"" Apr 20 20:13:49.808998 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.808967 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 20:13:49.809049 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.809002 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 20:13:49.809330 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.809312 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 20:13:49.818689 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.818669 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c9795dfbf-smds4"] 
Apr 20 20:13:49.870119 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.870094 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bkrnz\"" Apr 20 20:13:49.878254 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.878239 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8zpg6" Apr 20 20:13:49.943427 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.943369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-tls\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:49.943660 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.943455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvcwb\" (UniqueName: \"kubernetes.io/projected/67b94327-0f61-4d6d-ac15-45fa3a174b44-kube-api-access-xvcwb\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:49.943660 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.943495 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67b94327-0f61-4d6d-ac15-45fa3a174b44-metrics-client-ca\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:49.944131 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.944079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:49.944248 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.944127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:49.944329 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.944245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-grpc-tls\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:49.944329 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.944286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:49.944329 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:49.944314 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.000104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.000075 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8zpg6"] Apr 20 20:13:50.004299 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:50.004266 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b4f842_f248_42a7_8dcd_51dcb4e2ab36.slice/crio-a58f3f2f70402a2b1ddcc4266936065c824416a1a1a64935165ef76bb585fdf1 WatchSource:0}: Error finding container a58f3f2f70402a2b1ddcc4266936065c824416a1a1a64935165ef76bb585fdf1: Status 404 returned error can't find the container with id a58f3f2f70402a2b1ddcc4266936065c824416a1a1a64935165ef76bb585fdf1 Apr 20 20:13:50.045720 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.045692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-tls\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.045799 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.045733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvcwb\" (UniqueName: \"kubernetes.io/projected/67b94327-0f61-4d6d-ac15-45fa3a174b44-kube-api-access-xvcwb\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.045799 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.045758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67b94327-0f61-4d6d-ac15-45fa3a174b44-metrics-client-ca\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.045878 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.045813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.045878 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.045842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: 
\"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.045950 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.045904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-grpc-tls\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.045950 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.045943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.046123 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.046098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.046776 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.046748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67b94327-0f61-4d6d-ac15-45fa3a174b44-metrics-client-ca\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.048356 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.048323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.048445 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.048374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.048698 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.048681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-tls\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.048839 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.048820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.049028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.049011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.049130 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.049115 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/67b94327-0f61-4d6d-ac15-45fa3a174b44-secret-grpc-tls\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.053347 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.053330 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvcwb\" (UniqueName: \"kubernetes.io/projected/67b94327-0f61-4d6d-ac15-45fa3a174b44-kube-api-access-xvcwb\") pod \"thanos-querier-c9795dfbf-smds4\" (UID: \"67b94327-0f61-4d6d-ac15-45fa3a174b44\") " pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.117303 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.117285 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:50.234829 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.234688 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c9795dfbf-smds4"] Apr 20 20:13:50.237202 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:50.237175 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b94327_0f61_4d6d_ac15_45fa3a174b44.slice/crio-d345dca674a0fba0c665d2619c05a727310fe26bd3c2895298c16b305f5da616 WatchSource:0}: Error finding container d345dca674a0fba0c665d2619c05a727310fe26bd3c2895298c16b305f5da616: Status 404 returned error can't find the container with id d345dca674a0fba0c665d2619c05a727310fe26bd3c2895298c16b305f5da616 Apr 20 20:13:50.386850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.386820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lzw7p" event={"ID":"71b72aed-15d6-4289-8b96-b55802ca0734","Type":"ContainerStarted","Data":"08f5b48a718091e0b2cdfb259d825fd7c7d32850966352a55b24c3ae3fa3e49e"} Apr 20 20:13:50.386850 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.386853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lzw7p" event={"ID":"71b72aed-15d6-4289-8b96-b55802ca0734","Type":"ContainerStarted","Data":"468e642462f3335e42b34b81555c698c3481c99947a7f1f6b39f342534380223"} Apr 20 20:13:50.387972 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.387951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" event={"ID":"67b94327-0f61-4d6d-ac15-45fa3a174b44","Type":"ContainerStarted","Data":"d345dca674a0fba0c665d2619c05a727310fe26bd3c2895298c16b305f5da616"} Apr 20 
20:13:50.388849 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.388824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8zpg6" event={"ID":"14b4f842-f248-42a7-8dcd-51dcb4e2ab36","Type":"ContainerStarted","Data":"a58f3f2f70402a2b1ddcc4266936065c824416a1a1a64935165ef76bb585fdf1"} Apr 20 20:13:50.404491 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:50.404454 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lzw7p" podStartSLOduration=3.634527097 podStartE2EDuration="4.404443303s" podCreationTimestamp="2026-04-20 20:13:46 +0000 UTC" firstStartedPulling="2026-04-20 20:13:47.779208547 +0000 UTC m=+160.457475586" lastFinishedPulling="2026-04-20 20:13:48.549124746 +0000 UTC m=+161.227391792" observedRunningTime="2026-04-20 20:13:50.402977209 +0000 UTC m=+163.081244268" watchObservedRunningTime="2026-04-20 20:13:50.404443303 +0000 UTC m=+163.082710360" Apr 20 20:13:51.605190 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:51.605162 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s"] Apr 20 20:13:51.606924 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:51.606909 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" Apr 20 20:13:51.609660 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:51.609638 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-sx68h\"" Apr 20 20:13:51.609769 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:51.609668 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 20:13:51.616004 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:51.615983 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s"] Apr 20 20:13:51.660893 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:51.660842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4de5fc9-8d30-4aad-9ad4-6c3fe0910598-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2kk8s\" (UID: \"b4de5fc9-8d30-4aad-9ad4-6c3fe0910598\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" Apr 20 20:13:51.761929 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:51.761783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4de5fc9-8d30-4aad-9ad4-6c3fe0910598-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2kk8s\" (UID: \"b4de5fc9-8d30-4aad-9ad4-6c3fe0910598\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" Apr 20 20:13:51.762052 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:51.761947 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 20 20:13:51.762052 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:13:51.762037 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4de5fc9-8d30-4aad-9ad4-6c3fe0910598-monitoring-plugin-cert podName:b4de5fc9-8d30-4aad-9ad4-6c3fe0910598 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:52.262017013 +0000 UTC m=+164.940284058 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/b4de5fc9-8d30-4aad-9ad4-6c3fe0910598-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-2kk8s" (UID: "b4de5fc9-8d30-4aad-9ad4-6c3fe0910598") : secret "monitoring-plugin-cert" not found Apr 20 20:13:52.267832 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.267795 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4de5fc9-8d30-4aad-9ad4-6c3fe0910598-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2kk8s\" (UID: \"b4de5fc9-8d30-4aad-9ad4-6c3fe0910598\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" Apr 20 20:13:52.270526 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.270499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4de5fc9-8d30-4aad-9ad4-6c3fe0910598-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2kk8s\" (UID: \"b4de5fc9-8d30-4aad-9ad4-6c3fe0910598\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" Apr 20 20:13:52.399486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.399445 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8zpg6" event={"ID":"14b4f842-f248-42a7-8dcd-51dcb4e2ab36","Type":"ContainerStarted","Data":"300d15e3e93e36d72f15420be3d576dd07680d04a8f58e88daf4f18394dbcb8e"} Apr 20 20:13:52.399486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.399491 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8zpg6" event={"ID":"14b4f842-f248-42a7-8dcd-51dcb4e2ab36","Type":"ContainerStarted","Data":"4dda9a9bb724308a82e1ffcb8ef7d90fe1acdd03221a5808b4564998288eeb43"} Apr 20 20:13:52.399741 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.399623 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8zpg6" Apr 20 20:13:52.415886 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.415828 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8zpg6" podStartSLOduration=129.807913564 podStartE2EDuration="2m11.415810184s" podCreationTimestamp="2026-04-20 20:11:41 +0000 UTC" firstStartedPulling="2026-04-20 20:13:50.005965258 +0000 UTC m=+162.684232292" lastFinishedPulling="2026-04-20 20:13:51.61386186 +0000 UTC m=+164.292128912" observedRunningTime="2026-04-20 20:13:52.415010953 +0000 UTC m=+165.093278017" watchObservedRunningTime="2026-04-20 20:13:52.415810184 +0000 UTC m=+165.094077243" Apr 20 20:13:52.517622 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.517597 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" Apr 20 20:13:52.644706 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.644670 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s"] Apr 20 20:13:52.647514 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:52.647489 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4de5fc9_8d30_4aad_9ad4_6c3fe0910598.slice/crio-dc7e0fe53be699579da4897482704ba7ad46c05d28236177be0f7a15eea9d607 WatchSource:0}: Error finding container dc7e0fe53be699579da4897482704ba7ad46c05d28236177be0f7a15eea9d607: Status 404 returned error can't find the container with id dc7e0fe53be699579da4897482704ba7ad46c05d28236177be0f7a15eea9d607 Apr 20 20:13:52.911446 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.911417 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:52.911623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.911464 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:13:52.912746 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.912722 2578 patch_prober.go:28] interesting pod/console-76b547796f-td5bz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.15:8443/health\": dial tcp 10.132.0.15:8443: connect: connection refused" start-of-body= Apr 20 20:13:52.912808 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.912767 2578 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-76b547796f-td5bz" podUID="db0882fe-9e46-41fe-a40f-02b7855f16d1" containerName="console" probeResult="failure" output="Get \"https://10.132.0.15:8443/health\": dial tcp 10.132.0.15:8443: connect: connection refused" Apr 20 20:13:52.968270 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.968229 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:13:52.970896 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.970875 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:52.973915 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.973600 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 20:13:52.973915 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.973614 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-295oct4p20ok3\"" Apr 20 20:13:52.973915 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.973770 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 20:13:52.973915 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.973872 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fc9zh\"" Apr 20 20:13:52.973915 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.973873 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 20:13:52.973915 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.973874 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 20:13:52.974235 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.973977 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 20:13:52.974235 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.973994 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 20:13:52.974817 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.974789 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 20:13:52.974947 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.974865 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 20:13:52.974947 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.974931 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 20:13:52.975107 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.974949 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 20:13:52.975107 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.975064 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 20:13:52.975276 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.975187 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 20:13:52.976544 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.976526 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 20:13:52.982626 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:52.982608 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:13:53.073892 ip-10-0-134-66 
kubenswrapper[2578]: I0420 20:13:53.073845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074050 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.073916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074050 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.073939 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074050 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.073954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-config-out\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074050 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.073985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr4dj\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-kube-api-access-mr4dj\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074189 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074189 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074189 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074189 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074419 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074419 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074419 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074419 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074419 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-config\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074419 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074419 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074699 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.074699 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.074467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-web-config\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.175898 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.175826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-config\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.175898 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.175855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.175898 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.175877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.175898 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.175896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176170 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.175920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-web-config\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176170 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176170 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176170 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176159 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176343 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-config-out\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176343 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr4dj\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-kube-api-access-mr4dj\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176343 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176343 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176343 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176571 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176571 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176571 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176571 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.176571 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176522 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.177005 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.176699 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.177718 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.177242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.177718 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.177266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.178036 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.177926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.179644 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.179622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-config\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.179911 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.179867 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.180134 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.180110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.180221 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.180197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.180742 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.180699 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.180834 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.180757 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.181024 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.181003 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-web-config\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.181262 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.181241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.181739 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.181720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.182388 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.182359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.182388 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.182376 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.182877 ip-10-0-134-66 
kubenswrapper[2578]: I0420 20:13:53.182853 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-config-out\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.183029 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.183015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.186717 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.186698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr4dj\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-kube-api-access-mr4dj\") pod \"prometheus-k8s-0\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.280963 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.280922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:13:53.408910 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.408876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" event={"ID":"67b94327-0f61-4d6d-ac15-45fa3a174b44","Type":"ContainerStarted","Data":"f190082e3167c92275bfb60c9bb6cad38dfbe31766af3bd61f35297c373b9eee"} Apr 20 20:13:53.409084 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.408919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" event={"ID":"67b94327-0f61-4d6d-ac15-45fa3a174b44","Type":"ContainerStarted","Data":"725fd3d05e470b8fb68403210edf8de37e2804ad90239c5de9e3bc3de4492f4e"} Apr 20 20:13:53.409084 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.408933 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" event={"ID":"67b94327-0f61-4d6d-ac15-45fa3a174b44","Type":"ContainerStarted","Data":"f9a48addfc047e4ad01b436d569220321f73b4420664a4fd1b0aacbf5dbe00b6"} Apr 20 20:13:53.411763 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.411720 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" event={"ID":"b4de5fc9-8d30-4aad-9ad4-6c3fe0910598","Type":"ContainerStarted","Data":"dc7e0fe53be699579da4897482704ba7ad46c05d28236177be0f7a15eea9d607"} Apr 20 20:13:53.440047 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:53.440021 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:13:53.442140 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:53.442114 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2286408f_eeca_4f7e_8808_a066d991c09e.slice/crio-39e179222201de1cbd850e5ba8afd44be7f282d8ee61d6a10f88e234afb0a2c4 WatchSource:0}: Error finding container 39e179222201de1cbd850e5ba8afd44be7f282d8ee61d6a10f88e234afb0a2c4: Status 404 returned error can't find the container with id 39e179222201de1cbd850e5ba8afd44be7f282d8ee61d6a10f88e234afb0a2c4 Apr 20 20:13:54.417331 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.417287 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" event={"ID":"67b94327-0f61-4d6d-ac15-45fa3a174b44","Type":"ContainerStarted","Data":"4d13be7b4d7dae58ebe82977f8c1033fc522f37ba2684e62daa3ac8efc8e0115"} Apr 20 20:13:54.417331 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.417331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" event={"ID":"67b94327-0f61-4d6d-ac15-45fa3a174b44","Type":"ContainerStarted","Data":"da55c1f46853e504be217b5aed4bbfc39fbf5ef7a07698f699820385e296f2ed"} Apr 20 20:13:54.417830 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.417349 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" event={"ID":"67b94327-0f61-4d6d-ac15-45fa3a174b44","Type":"ContainerStarted","Data":"c128b59c2ddafaf4840548a8f9c6971ed3e78fed0960c707bba3b12c6a63b7e7"} Apr 20 20:13:54.417830 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.417475 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:13:54.418879 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.418855 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" event={"ID":"b4de5fc9-8d30-4aad-9ad4-6c3fe0910598","Type":"ContainerStarted","Data":"d5290d52cf3a606f968ca85920909b2dd08c47ef66503bda969c8ffe55286d39"} Apr 20 20:13:54.419075 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.419046 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" Apr 20 20:13:54.425306 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.421760 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerStarted","Data":"39e179222201de1cbd850e5ba8afd44be7f282d8ee61d6a10f88e234afb0a2c4"} Apr 20 20:13:54.425306 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.425161 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" Apr 20 20:13:54.440869 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.440816 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" podStartSLOduration=2.074264361 podStartE2EDuration="5.440801098s" podCreationTimestamp="2026-04-20 20:13:49 +0000 UTC" firstStartedPulling="2026-04-20 20:13:50.238992592 +0000 UTC m=+162.917259630" lastFinishedPulling="2026-04-20 20:13:53.605529328 +0000 UTC m=+166.283796367" observedRunningTime="2026-04-20 20:13:54.437969669 +0000 UTC m=+167.116236725" watchObservedRunningTime="2026-04-20 20:13:54.440801098 +0000 UTC m=+167.119068157" Apr 20 20:13:54.452239 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:54.452196 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2kk8s" podStartSLOduration=2.018808615 podStartE2EDuration="3.452182316s" podCreationTimestamp="2026-04-20 20:13:51 +0000 UTC" firstStartedPulling="2026-04-20 20:13:52.649465476 +0000 UTC m=+165.327732511" lastFinishedPulling="2026-04-20 20:13:54.082839174 +0000 UTC m=+166.761106212" observedRunningTime="2026-04-20 20:13:54.451605685 +0000 UTC m=+167.129872737" 
watchObservedRunningTime="2026-04-20 20:13:54.452182316 +0000 UTC m=+167.130449374" Apr 20 20:13:55.425386 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:55.425340 2578 generic.go:358] "Generic (PLEG): container finished" podID="2286408f-eeca-4f7e-8808-a066d991c09e" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" exitCode=0 Apr 20 20:13:55.425891 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:55.425474 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerDied","Data":"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1"} Apr 20 20:13:56.886539 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:56.886505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:13:56.889607 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:56.889589 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rfvs6\"" Apr 20 20:13:56.897695 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:56.897675 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jp7bd" Apr 20 20:13:57.015069 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:57.015042 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b547796f-td5bz"] Apr 20 20:13:57.028755 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:57.028441 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jp7bd"] Apr 20 20:13:57.031640 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:13:57.031611 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e10e1c4_64b7_4da9_be2e_38882fc2fbb7.slice/crio-b4fbe82292eaaf127caadbba95b053d578a71e9181183b7f9103fc99a0e7714f WatchSource:0}: Error finding container b4fbe82292eaaf127caadbba95b053d578a71e9181183b7f9103fc99a0e7714f: Status 404 returned error can't find the container with id b4fbe82292eaaf127caadbba95b053d578a71e9181183b7f9103fc99a0e7714f Apr 20 20:13:57.433184 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:57.433148 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jp7bd" event={"ID":"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7","Type":"ContainerStarted","Data":"b4fbe82292eaaf127caadbba95b053d578a71e9181183b7f9103fc99a0e7714f"} Apr 20 20:13:59.890165 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:59.887199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:13:59.890165 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:59.889635 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:13:59.892468 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:59.892443 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rb8vs\"" Apr 20 20:13:59.900627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:13:59.900602 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:14:00.028650 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.028625 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f9bf74767-6pq9d"] Apr 20 20:14:00.031060 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:14:00.031034 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4764d884_9129_4d18_82c6_332686e75ae2.slice/crio-a7d7568155a3b4b4c68570b49ef39ac0b64816b8772290fedb8e93bc0e7aadd8 WatchSource:0}: Error finding container a7d7568155a3b4b4c68570b49ef39ac0b64816b8772290fedb8e93bc0e7aadd8: Status 404 returned error can't find the container with id a7d7568155a3b4b4c68570b49ef39ac0b64816b8772290fedb8e93bc0e7aadd8 Apr 20 20:14:00.433197 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.433170 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-c9795dfbf-smds4" Apr 20 20:14:00.443628 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.443595 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jp7bd" event={"ID":"3e10e1c4-64b7-4da9-be2e-38882fc2fbb7","Type":"ContainerStarted","Data":"42c4b86fd36250d130c0b87cc06e887c66a05baf2348aacda40abb5a6471c9d5"} Apr 20 20:14:00.446413 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.446390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerStarted","Data":"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4"} Apr 20 20:14:00.446413 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.446417 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerStarted","Data":"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2"} Apr 20 20:14:00.446600 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.446430 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerStarted","Data":"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1"} Apr 20 20:14:00.446600 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.446440 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerStarted","Data":"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9"} Apr 20 20:14:00.446600 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.446449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerStarted","Data":"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615"} Apr 20 20:14:00.446600 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.446459 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerStarted","Data":"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3"} Apr 20 20:14:00.447995 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.447976 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" event={"ID":"4764d884-9129-4d18-82c6-332686e75ae2","Type":"ContainerStarted","Data":"c30cf02633ed3ee5e268319071ec0f264a4c0fc4d6b7e50c584634ef57d16dfb"} Apr 20 20:14:00.448096 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.448000 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" event={"ID":"4764d884-9129-4d18-82c6-332686e75ae2","Type":"ContainerStarted","Data":"a7d7568155a3b4b4c68570b49ef39ac0b64816b8772290fedb8e93bc0e7aadd8"} Apr 20 20:14:00.448153 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.448102 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:14:00.470972 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.470923 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" podStartSLOduration=172.470910724 podStartE2EDuration="2m52.470910724s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:14:00.470714254 +0000 UTC m=+173.148981314" watchObservedRunningTime="2026-04-20 20:14:00.470910724 +0000 UTC m=+173.149177819" Apr 20 20:14:00.501776 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.501720 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.337844897 podStartE2EDuration="8.501703018s" podCreationTimestamp="2026-04-20 20:13:52 +0000 UTC" firstStartedPulling="2026-04-20 20:13:53.444538242 +0000 UTC m=+166.122805283" lastFinishedPulling="2026-04-20 20:13:59.608396369 +0000 UTC m=+172.286663404" observedRunningTime="2026-04-20 20:14:00.498910625 +0000 UTC m=+173.177177679" watchObservedRunningTime="2026-04-20 20:14:00.501703018 +0000 UTC m=+173.179970079" Apr 20 20:14:00.513695 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:00.513654 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jp7bd" podStartSLOduration=136.939671733 podStartE2EDuration="2m19.513642724s" podCreationTimestamp="2026-04-20 20:11:41 +0000 UTC" firstStartedPulling="2026-04-20 20:13:57.033962562 +0000 UTC m=+169.712229612" lastFinishedPulling="2026-04-20 20:13:59.607933565 +0000 UTC m=+172.286200603" observedRunningTime="2026-04-20 20:14:00.512781561 +0000 UTC m=+173.191048619" watchObservedRunningTime="2026-04-20 20:14:00.513642724 +0000 UTC m=+173.191909846" Apr 20 20:14:02.414486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:02.414456 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8zpg6" Apr 20 20:14:03.281256 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:03.281214 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:14:10.362802 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:10.362774 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f9bf74767-6pq9d"] Apr 20 20:14:16.499645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:16.499607 2578 generic.go:358] "Generic (PLEG): container finished" podID="210a9def-653b-4320-b877-97fe51dc9677" containerID="2c9b0330db68ca4d842c8b079f7b603a2f5d3198be971e41ac35b58e7bec9988" 
exitCode=0 Apr 20 20:14:16.500116 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:16.499666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8hmpw" event={"ID":"210a9def-653b-4320-b877-97fe51dc9677","Type":"ContainerDied","Data":"2c9b0330db68ca4d842c8b079f7b603a2f5d3198be971e41ac35b58e7bec9988"} Apr 20 20:14:16.500290 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:16.500270 2578 scope.go:117] "RemoveContainer" containerID="2c9b0330db68ca4d842c8b079f7b603a2f5d3198be971e41ac35b58e7bec9988" Apr 20 20:14:17.504852 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:17.504819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8hmpw" event={"ID":"210a9def-653b-4320-b877-97fe51dc9677","Type":"ContainerStarted","Data":"cdd392356462e5334615d7d7c5130756249f8ef08e5854176545c0b1ef13d6f1"} Apr 20 20:14:20.368197 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:20.368169 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:14:22.038254 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.038214 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76b547796f-td5bz" podUID="db0882fe-9e46-41fe-a40f-02b7855f16d1" containerName="console" containerID="cri-o://61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9" gracePeriod=15 Apr 20 20:14:22.271176 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.271152 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b547796f-td5bz_db0882fe-9e46-41fe-a40f-02b7855f16d1/console/0.log" Apr 20 20:14:22.271300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.271229 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:14:22.317265 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317183 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-serving-cert\") pod \"db0882fe-9e46-41fe-a40f-02b7855f16d1\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " Apr 20 20:14:22.317265 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317242 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-config\") pod \"db0882fe-9e46-41fe-a40f-02b7855f16d1\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " Apr 20 20:14:22.317441 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317278 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-service-ca\") pod \"db0882fe-9e46-41fe-a40f-02b7855f16d1\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " Apr 20 20:14:22.317441 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317300 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jtmc\" (UniqueName: \"kubernetes.io/projected/db0882fe-9e46-41fe-a40f-02b7855f16d1-kube-api-access-8jtmc\") pod \"db0882fe-9e46-41fe-a40f-02b7855f16d1\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " Apr 20 20:14:22.317441 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317326 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-oauth-config\") pod \"db0882fe-9e46-41fe-a40f-02b7855f16d1\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " Apr 20 20:14:22.317441 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317350 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-oauth-serving-cert\") pod \"db0882fe-9e46-41fe-a40f-02b7855f16d1\" (UID: \"db0882fe-9e46-41fe-a40f-02b7855f16d1\") " Apr 20 20:14:22.317752 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317720 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-service-ca" (OuterVolumeSpecName: "service-ca") pod "db0882fe-9e46-41fe-a40f-02b7855f16d1" (UID: "db0882fe-9e46-41fe-a40f-02b7855f16d1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:22.317752 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317738 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-config" (OuterVolumeSpecName: "console-config") pod "db0882fe-9e46-41fe-a40f-02b7855f16d1" (UID: "db0882fe-9e46-41fe-a40f-02b7855f16d1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:22.317911 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.317877 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "db0882fe-9e46-41fe-a40f-02b7855f16d1" (UID: "db0882fe-9e46-41fe-a40f-02b7855f16d1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:22.319803 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.319780 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db0882fe-9e46-41fe-a40f-02b7855f16d1" (UID: "db0882fe-9e46-41fe-a40f-02b7855f16d1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:14:22.319803 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.319792 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db0882fe-9e46-41fe-a40f-02b7855f16d1" (UID: "db0882fe-9e46-41fe-a40f-02b7855f16d1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:14:22.319925 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.319842 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db0882fe-9e46-41fe-a40f-02b7855f16d1-kube-api-access-8jtmc" (OuterVolumeSpecName: "kube-api-access-8jtmc") pod "db0882fe-9e46-41fe-a40f-02b7855f16d1" (UID: "db0882fe-9e46-41fe-a40f-02b7855f16d1"). InnerVolumeSpecName "kube-api-access-8jtmc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:14:22.418090 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.418052 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-service-ca\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:22.418090 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.418083 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8jtmc\" (UniqueName: \"kubernetes.io/projected/db0882fe-9e46-41fe-a40f-02b7855f16d1-kube-api-access-8jtmc\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:22.418090 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.418094 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-oauth-config\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:22.418090 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.418103 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-oauth-serving-cert\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:22.418416 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.418112 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-serving-cert\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:22.418416 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.418123 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db0882fe-9e46-41fe-a40f-02b7855f16d1-console-config\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:22.522073 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.522045 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b547796f-td5bz_db0882fe-9e46-41fe-a40f-02b7855f16d1/console/0.log" Apr 20 20:14:22.522263 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.522087 2578 generic.go:358] "Generic (PLEG): container finished" podID="db0882fe-9e46-41fe-a40f-02b7855f16d1" containerID="61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9" exitCode=2 Apr 20 20:14:22.522263 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.522145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b547796f-td5bz" event={"ID":"db0882fe-9e46-41fe-a40f-02b7855f16d1","Type":"ContainerDied","Data":"61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9"} Apr 20 20:14:22.522263 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.522172 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76b547796f-td5bz" Apr 20 20:14:22.522263 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.522191 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b547796f-td5bz" event={"ID":"db0882fe-9e46-41fe-a40f-02b7855f16d1","Type":"ContainerDied","Data":"f85cb3480073c69905ea03edab6c6e8267408c43c5c230336c3b620fefd94d03"} Apr 20 20:14:22.522263 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.522205 2578 scope.go:117] "RemoveContainer" containerID="61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9" Apr 20 20:14:22.530693 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.530674 2578 scope.go:117] "RemoveContainer" containerID="61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9" Apr 20 20:14:22.530984 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:14:22.530965 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9\": container with ID starting with 61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9 not found: ID does not exist" containerID="61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9" Apr 20 20:14:22.531021 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.530995 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9"} err="failed to get container status \"61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9\": rpc error: code = NotFound desc = could not find container \"61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9\": container with ID starting with 61ba0b7682e2adebb7752467c1e462d03e3517ee1d76f8a71ede360ce58882b9 not found: ID does not exist" Apr 20 20:14:22.542629 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.542601 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b547796f-td5bz"] Apr 20 20:14:22.547996 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:22.547973 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76b547796f-td5bz"] Apr 20 20:14:23.892704 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:23.892668 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db0882fe-9e46-41fe-a40f-02b7855f16d1" path="/var/lib/kubelet/pods/db0882fe-9e46-41fe-a40f-02b7855f16d1/volumes" Apr 20 20:14:35.381503 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:35.381436 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" podUID="4764d884-9129-4d18-82c6-332686e75ae2" containerName="registry" containerID="cri-o://c30cf02633ed3ee5e268319071ec0f264a4c0fc4d6b7e50c584634ef57d16dfb" gracePeriod=30 Apr 20 20:14:36.563790 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.563755 2578 generic.go:358] "Generic (PLEG): container finished" podID="4764d884-9129-4d18-82c6-332686e75ae2" containerID="c30cf02633ed3ee5e268319071ec0f264a4c0fc4d6b7e50c584634ef57d16dfb" exitCode=0 Apr 20 20:14:36.564135 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.563830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" event={"ID":"4764d884-9129-4d18-82c6-332686e75ae2","Type":"ContainerDied","Data":"c30cf02633ed3ee5e268319071ec0f264a4c0fc4d6b7e50c584634ef57d16dfb"} Apr 
20 20:14:36.618148 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.618130 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:14:36.726567 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.726506 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-bound-sa-token\") pod \"4764d884-9129-4d18-82c6-332686e75ae2\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " Apr 20 20:14:36.726567 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.726540 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") pod \"4764d884-9129-4d18-82c6-332686e75ae2\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " Apr 20 20:14:36.726730 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.726596 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-registry-certificates\") pod \"4764d884-9129-4d18-82c6-332686e75ae2\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " Apr 20 20:14:36.726730 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.726632 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7w6c\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-kube-api-access-r7w6c\") pod \"4764d884-9129-4d18-82c6-332686e75ae2\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " Apr 20 20:14:36.726730 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.726666 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-installation-pull-secrets\") pod \"4764d884-9129-4d18-82c6-332686e75ae2\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " Apr 20 20:14:36.726846 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.726767 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-image-registry-private-configuration\") pod \"4764d884-9129-4d18-82c6-332686e75ae2\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " Apr 20 20:14:36.726846 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.726801 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4764d884-9129-4d18-82c6-332686e75ae2-ca-trust-extracted\") pod \"4764d884-9129-4d18-82c6-332686e75ae2\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " Apr 20 20:14:36.726846 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.726827 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-trusted-ca\") pod \"4764d884-9129-4d18-82c6-332686e75ae2\" (UID: \"4764d884-9129-4d18-82c6-332686e75ae2\") " Apr 20 20:14:36.727146 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.727118 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-registry-certificates" (OuterVolumeSpecName: 
"registry-certificates") pod "4764d884-9129-4d18-82c6-332686e75ae2" (UID: "4764d884-9129-4d18-82c6-332686e75ae2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:36.727410 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.727387 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4764d884-9129-4d18-82c6-332686e75ae2" (UID: "4764d884-9129-4d18-82c6-332686e75ae2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:36.729615 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.729553 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4764d884-9129-4d18-82c6-332686e75ae2" (UID: "4764d884-9129-4d18-82c6-332686e75ae2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:14:36.729736 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.729609 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4764d884-9129-4d18-82c6-332686e75ae2" (UID: "4764d884-9129-4d18-82c6-332686e75ae2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:14:36.729736 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.729616 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4764d884-9129-4d18-82c6-332686e75ae2" (UID: "4764d884-9129-4d18-82c6-332686e75ae2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:14:36.729736 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.729621 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4764d884-9129-4d18-82c6-332686e75ae2" (UID: "4764d884-9129-4d18-82c6-332686e75ae2"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:14:36.729736 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.729698 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-kube-api-access-r7w6c" (OuterVolumeSpecName: "kube-api-access-r7w6c") pod "4764d884-9129-4d18-82c6-332686e75ae2" (UID: "4764d884-9129-4d18-82c6-332686e75ae2"). InnerVolumeSpecName "kube-api-access-r7w6c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:14:36.734813 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.734788 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4764d884-9129-4d18-82c6-332686e75ae2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4764d884-9129-4d18-82c6-332686e75ae2" (UID: "4764d884-9129-4d18-82c6-332686e75ae2"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:14:36.827667 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.827648 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-bound-sa-token\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:36.827667 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.827668 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-registry-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:36.827797 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.827676 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-registry-certificates\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:36.827797 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.827686 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r7w6c\" (UniqueName: \"kubernetes.io/projected/4764d884-9129-4d18-82c6-332686e75ae2-kube-api-access-r7w6c\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:36.827797 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.827696 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-installation-pull-secrets\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:36.827797 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.827705 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4764d884-9129-4d18-82c6-332686e75ae2-image-registry-private-configuration\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:36.827797 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.827715 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4764d884-9129-4d18-82c6-332686e75ae2-ca-trust-extracted\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:36.827797 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:36.827723 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4764d884-9129-4d18-82c6-332686e75ae2-trusted-ca\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:14:37.568081 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:37.568043 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" event={"ID":"4764d884-9129-4d18-82c6-332686e75ae2","Type":"ContainerDied","Data":"a7d7568155a3b4b4c68570b49ef39ac0b64816b8772290fedb8e93bc0e7aadd8"} Apr 20 20:14:37.568565 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:37.568085 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f9bf74767-6pq9d" Apr 20 20:14:37.568565 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:37.568095 2578 scope.go:117] "RemoveContainer" containerID="c30cf02633ed3ee5e268319071ec0f264a4c0fc4d6b7e50c584634ef57d16dfb" Apr 20 20:14:37.588909 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:37.588885 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f9bf74767-6pq9d"] Apr 20 20:14:37.591919 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:37.591897 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5f9bf74767-6pq9d"] Apr 20 20:14:37.889923 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:37.889861 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4764d884-9129-4d18-82c6-332686e75ae2" path="/var/lib/kubelet/pods/4764d884-9129-4d18-82c6-332686e75ae2/volumes" Apr 20 20:14:53.282119 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:53.282088 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:14:53.303015 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:53.302984 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:14:53.630249 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:14:53.630188 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.254422 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.254379 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:15:11.254928 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.254869 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="prometheus" containerID="cri-o://68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" gracePeriod=600 Apr 20 20:15:11.255041 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.254915 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy" containerID="cri-o://06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" gracePeriod=600 Apr 20 20:15:11.255041 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.254952 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy-web" containerID="cri-o://3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" gracePeriod=600 Apr 20 20:15:11.255041 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.254937 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="thanos-sidecar" containerID="cri-o://09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" gracePeriod=600 Apr 20 20:15:11.255041 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.254967 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="config-reloader" 
containerID="cri-o://050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" gracePeriod=600 Apr 20 20:15:11.255266 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.255102 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy-thanos" containerID="cri-o://51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" gracePeriod=600 Apr 20 20:15:11.496656 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.496632 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.580189 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580123 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580189 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580156 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-config-out\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580189 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580181 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-metrics-client-certs\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580432 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580207 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-serving-certs-ca-bundle\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580432 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580226 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580432 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580358 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-thanos-prometheus-http-client-file\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580432 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580398 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-tls-assets\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580432 ip-10-0-134-66 kubenswrapper[2578]: I0420 
20:15:11.580423 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr4dj\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-kube-api-access-mr4dj\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580455 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-metrics-client-ca\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580486 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-trusted-ca-bundle\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580516 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-web-config\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580544 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-grpc-tls\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580618 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-db\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580652 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-kube-rbac-proxy\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.580709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580643 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:11.580709 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580675 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-config\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.581095 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580712 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-kubelet-serving-ca-bundle\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.581095 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580741 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-tls\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.581095 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.580766 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-rulefiles-0\") pod \"2286408f-eeca-4f7e-8808-a066d991c09e\" (UID: \"2286408f-eeca-4f7e-8808-a066d991c09e\") " Apr 20 20:15:11.581276 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.581146 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:11.581276 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.581251 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.581276 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.581273 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.582431 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.582165 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:11.582557 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.582523 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:11.583353 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.583310 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:11.583633 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.583356 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.583633 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.583572 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.583633 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.583613 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.583820 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.583656 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.583945 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.583912 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:15:11.584298 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.584263 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:15:11.584725 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.584701 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-config-out" (OuterVolumeSpecName: "config-out") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:15:11.584989 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.584969 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-config" (OuterVolumeSpecName: "config") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.585111 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.585092 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-kube-api-access-mr4dj" (OuterVolumeSpecName: "kube-api-access-mr4dj") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "kube-api-access-mr4dj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:15:11.585540 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.585523 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.585746 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.585650 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.585828 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.585776 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.595721 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.595692 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-web-config" (OuterVolumeSpecName: "web-config") pod "2286408f-eeca-4f7e-8808-a066d991c09e" (UID: "2286408f-eeca-4f7e-8808-a066d991c09e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:11.664184 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664157 2578 generic.go:358] "Generic (PLEG): container finished" podID="2286408f-eeca-4f7e-8808-a066d991c09e" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" exitCode=0 Apr 20 20:15:11.664264 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664184 2578 generic.go:358] "Generic (PLEG): container finished" podID="2286408f-eeca-4f7e-8808-a066d991c09e" containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" exitCode=0 Apr 20 20:15:11.664264 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664193 2578 generic.go:358] "Generic (PLEG): container finished" podID="2286408f-eeca-4f7e-8808-a066d991c09e" containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" exitCode=0 Apr 20 20:15:11.664264 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664204 2578 generic.go:358] "Generic (PLEG): container finished" podID="2286408f-eeca-4f7e-8808-a066d991c09e" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" exitCode=0 Apr 20 20:15:11.664264 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664211 2578 generic.go:358] "Generic (PLEG): container finished" podID="2286408f-eeca-4f7e-8808-a066d991c09e" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" exitCode=0 Apr 20 20:15:11.664264 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664218 2578 generic.go:358] "Generic (PLEG): container finished" podID="2286408f-eeca-4f7e-8808-a066d991c09e" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" exitCode=0 Apr 20 20:15:11.664264 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664243 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerDied","Data":"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4"} Apr 20 20:15:11.664444 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664282 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerDied","Data":"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2"} Apr 20 20:15:11.664444 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664287 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.664444 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664294 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerDied","Data":"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1"} Apr 20 20:15:11.664444 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664303 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerDied","Data":"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9"} Apr 20 20:15:11.664444 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerDied","Data":"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615"} Apr 20 20:15:11.664444 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerDied","Data":"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3"} Apr 20 20:15:11.664444 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2286408f-eeca-4f7e-8808-a066d991c09e","Type":"ContainerDied","Data":"39e179222201de1cbd850e5ba8afd44be7f282d8ee61d6a10f88e234afb0a2c4"} Apr 20 20:15:11.664444 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.664347 2578 scope.go:117] "RemoveContainer" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" Apr 20 20:15:11.672083 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.671993 2578 scope.go:117] "RemoveContainer" containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" Apr 20 20:15:11.678943 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.678928 2578 scope.go:117] "RemoveContainer" containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" Apr 20 20:15:11.682023 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.681998 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-metrics-client-certs\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682023 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682018 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682035 2578 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682047 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-tls-assets\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682057 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mr4dj\" (UniqueName: \"kubernetes.io/projected/2286408f-eeca-4f7e-8808-a066d991c09e-kube-api-access-mr4dj\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682066 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-metrics-client-ca\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682074 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-web-config\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682082 2578 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-grpc-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682090 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-db\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682099 2578 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-kube-rbac-proxy\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682112 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-config\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682121 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682446 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682130 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682446 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682139 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2286408f-eeca-4f7e-8808-a066d991c09e-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682446 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682148 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/2286408f-eeca-4f7e-8808-a066d991c09e-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.682446 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.682157 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2286408f-eeca-4f7e-8808-a066d991c09e-config-out\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:15:11.685107 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.685089 2578 scope.go:117] "RemoveContainer" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" Apr 20 20:15:11.687338 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.687317 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:15:11.691366 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.691350 2578 scope.go:117] "RemoveContainer" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" Apr 20 20:15:11.692008 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.691989 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:15:11.697277 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.697263 2578 scope.go:117] "RemoveContainer" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" Apr 20 20:15:11.703903 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.703889 2578 scope.go:117] "RemoveContainer" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" Apr 20 20:15:11.709787 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.709773 2578 scope.go:117] "RemoveContainer" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" Apr 20 20:15:11.710039 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:15:11.710018 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": container with ID starting with 51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4 not found: ID does not exist" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" Apr 20 20:15:11.710121 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.710045 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4"} err="failed to get container status \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": rpc error: code = NotFound desc = could not find container \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": container with ID starting with 51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4 not found: ID does not exist" Apr 20 20:15:11.710121 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.710066 2578 scope.go:117] "RemoveContainer" containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" Apr 20 20:15:11.710307 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:15:11.710289 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": container with ID starting with 06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2 not found: ID does not exist" 
containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" Apr 20 20:15:11.710352 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.710314 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2"} err="failed to get container status \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": rpc error: code = NotFound desc = could not find container \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": container with ID starting with 06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2 not found: ID does not exist" Apr 20 20:15:11.710352 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.710330 2578 scope.go:117] "RemoveContainer" containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" Apr 20 20:15:11.710553 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:15:11.710537 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": container with ID starting with 3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1 not found: ID does not exist" containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" Apr 20 20:15:11.710626 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.710556 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1"} err="failed to get container status \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": rpc error: code = NotFound desc = could not find container \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": container with ID starting with 3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1 not found: ID does not exist" Apr 20 20:15:11.710626 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.710603 2578 scope.go:117] "RemoveContainer" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" Apr 20 20:15:11.710857 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:15:11.710842 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": container with ID starting with 09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9 not found: ID does not exist" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" Apr 20 20:15:11.710915 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.710862 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9"} err="failed to get container status \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": rpc error: code = NotFound desc = could not find container \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": container with ID starting with 09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9 not found: ID does not exist" Apr 20 20:15:11.710915 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.710875 2578 scope.go:117] "RemoveContainer" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" Apr 20 20:15:11.711123 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:15:11.711107 2578 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": container with ID starting with 050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615 not found: ID does not exist" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" Apr 20 20:15:11.711162 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.711127 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615"} err="failed to get container status \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": rpc error: code = NotFound desc = could not find container \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": container with ID starting with 050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615 not found: ID does not exist" Apr 20 20:15:11.711162 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.711142 2578 scope.go:117] "RemoveContainer" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" Apr 20 20:15:11.711355 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:15:11.711340 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": container with ID starting with 68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3 not found: ID does not exist" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" Apr 20 20:15:11.711413 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.711358 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3"} err="failed to get container status \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": rpc error: code = NotFound desc = could not find container \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": container with ID starting with 68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3 not found: ID does not exist" Apr 20 20:15:11.711413 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.711369 2578 scope.go:117] "RemoveContainer" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" Apr 20 20:15:11.711566 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:15:11.711552 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": container with ID starting with 85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1 not found: ID does not exist" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" Apr 20 20:15:11.711638 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.711568 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1"} err="failed to get container status \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": rpc error: code = NotFound desc = could not find container \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": container with ID starting with 
85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1 not found: ID does not exist" Apr 20 20:15:11.711638 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.711601 2578 scope.go:117] "RemoveContainer" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" Apr 20 20:15:11.711821 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.711803 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4"} err="failed to get container status \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": rpc error: code = NotFound desc = could not find container \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": container with ID starting with 51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4 not found: ID does not exist" Apr 20 20:15:11.711863 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.711823 2578 scope.go:117] "RemoveContainer" containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" Apr 20 20:15:11.712041 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712022 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2"} err="failed to get container status \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": rpc error: code = NotFound desc = could not find container \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": container with ID starting with 06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2 not found: ID does not exist" Apr 20 20:15:11.712093 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712041 2578 scope.go:117] "RemoveContainer" containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" Apr 20 20:15:11.712234 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712220 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1"} err="failed to get container status \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": rpc error: code = NotFound desc = could not find container \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": container with ID starting with 3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1 not found: ID does not exist" Apr 20 20:15:11.712275 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712237 2578 scope.go:117] "RemoveContainer" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" Apr 20 20:15:11.712419 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712405 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9"} err="failed to get container status \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": rpc error: code = NotFound desc = could not find container \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": container with ID starting with 09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9 not found: ID does not exist" Apr 20 20:15:11.712465 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712420 2578 scope.go:117] "RemoveContainer" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" Apr 20 20:15:11.712645 
ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712629 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615"} err="failed to get container status \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": rpc error: code = NotFound desc = could not find container \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": container with ID starting with 050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615 not found: ID does not exist" Apr 20 20:15:11.712695 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712646 2578 scope.go:117] "RemoveContainer" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" Apr 20 20:15:11.712863 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712846 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3"} err="failed to get container status \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": rpc error: code = NotFound desc = could not find container \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": container with ID starting with 68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3 not found: ID does not exist" Apr 20 20:15:11.712914 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.712863 2578 scope.go:117] "RemoveContainer" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" Apr 20 20:15:11.713076 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713059 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1"} err="failed to get container status \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": rpc error: code = NotFound desc = could not find container \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": container with ID starting with 85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1 not found: ID does not exist" Apr 20 20:15:11.713117 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713077 2578 scope.go:117] "RemoveContainer" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" Apr 20 20:15:11.713268 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713250 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4"} err="failed to get container status \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": rpc error: code = NotFound desc = could not find container \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": container with ID starting with 51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4 not found: ID does not exist" Apr 20 20:15:11.713311 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713269 2578 scope.go:117] "RemoveContainer" containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" Apr 20 20:15:11.713463 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713446 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2"} err="failed to get container status 
\"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": rpc error: code = NotFound desc = could not find container \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": container with ID starting with 06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2 not found: ID does not exist" Apr 20 20:15:11.713463 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713462 2578 scope.go:117] "RemoveContainer" containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" Apr 20 20:15:11.713652 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713637 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1"} err="failed to get container status \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": rpc error: code = NotFound desc = could not find container \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": container with ID starting with 3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1 not found: ID does not exist" Apr 20 20:15:11.713691 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713652 2578 scope.go:117] "RemoveContainer" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" Apr 20 20:15:11.713964 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713936 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9"} err="failed to get container status \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": rpc error: code = NotFound desc = could not find container \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": container with ID starting with 09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9 not found: ID does not exist" Apr 20 20:15:11.713964 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.713964 2578 scope.go:117] "RemoveContainer" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" Apr 20 20:15:11.714226 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.714208 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615"} err="failed to get container status \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": rpc error: code = NotFound desc = could not find container \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": container with ID starting with 050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615 not found: ID does not exist" Apr 20 20:15:11.714315 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.714227 2578 scope.go:117] "RemoveContainer" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" Apr 20 20:15:11.714528 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.714455 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3"} err="failed to get container status \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": rpc error: code = NotFound desc = could not find container \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": container with ID starting with 68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3 not found: ID does not 
exist" Apr 20 20:15:11.714528 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.714484 2578 scope.go:117] "RemoveContainer" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" Apr 20 20:15:11.714847 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.714820 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1"} err="failed to get container status \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": rpc error: code = NotFound desc = could not find container \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": container with ID starting with 85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1 not found: ID does not exist" Apr 20 20:15:11.714912 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.714850 2578 scope.go:117] "RemoveContainer" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" Apr 20 20:15:11.715099 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715073 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4"} err="failed to get container status \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": rpc error: code = NotFound desc = could not find container \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": container with ID starting with 51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4 not found: ID does not exist" Apr 20 20:15:11.715099 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715098 2578 scope.go:117] "RemoveContainer" containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" Apr 20 20:15:11.715289 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715271 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2"} err="failed to get container status \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": rpc error: code = NotFound desc = could not find container \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": container with ID starting with 06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2 not found: ID does not exist" Apr 20 20:15:11.715342 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715291 2578 scope.go:117] "RemoveContainer" containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" Apr 20 20:15:11.715481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715465 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1"} err="failed to get container status \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": rpc error: code = NotFound desc = could not find container \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": container with ID starting with 3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1 not found: ID does not exist" Apr 20 20:15:11.715481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715480 2578 scope.go:117] "RemoveContainer" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" Apr 20 20:15:11.715732 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715711 2578 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:15:11.715806 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715726 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9"} err="failed to get container status \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": rpc error: code = NotFound desc = could not find container \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": container with ID starting with 09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9 not found: ID does not exist" Apr 20 20:15:11.715806 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715746 2578 scope.go:117] "RemoveContainer" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" Apr 20 20:15:11.715973 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715957 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615"} err="failed to get container status \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": rpc error: code = NotFound desc = could not find container \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": container with ID starting with 050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615 not found: ID does not exist" Apr 20 20:15:11.716012 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.715975 2578 scope.go:117] "RemoveContainer" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" Apr 20 20:15:11.716055 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716029 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="config-reloader" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716055 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="config-reloader" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716069 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy-web" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716077 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy-web" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716093 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4764d884-9129-4d18-82c6-332686e75ae2" containerName="registry" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716101 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4764d884-9129-4d18-82c6-332686e75ae2" containerName="registry" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716112 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db0882fe-9e46-41fe-a40f-02b7855f16d1" containerName="console" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716120 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0882fe-9e46-41fe-a40f-02b7855f16d1" containerName="console" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716133 2578 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy-thanos" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716141 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy-thanos" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716153 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="thanos-sidecar" Apr 20 20:15:11.716155 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716158 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="thanos-sidecar" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716164 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716169 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716175 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="init-config-reloader" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716180 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="init-config-reloader" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716185 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="prometheus" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716189 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="prometheus" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716208 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3"} err="failed to get container status \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": rpc error: code = NotFound desc = could not find container \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": container with ID starting with 68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3 not found: ID does not exist" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716225 2578 scope.go:117] "RemoveContainer" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716235 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="thanos-sidecar" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716243 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716249 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="4764d884-9129-4d18-82c6-332686e75ae2" containerName="registry" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716256 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="prometheus" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716261 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy-web" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716267 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="kube-rbac-proxy-thanos" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716275 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="db0882fe-9e46-41fe-a40f-02b7855f16d1" containerName="console" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716281 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" containerName="config-reloader" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716430 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1"} err="failed to get container status \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": rpc error: code = NotFound desc = could not find container \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": container with ID starting with 85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1 not found: ID does not exist" Apr 20 20:15:11.716486 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716444 2578 scope.go:117] "RemoveContainer" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" Apr 20 20:15:11.717122 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716669 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4"} err="failed to get container status \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": rpc error: code = NotFound desc = could not find container \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": container with ID starting with 51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4 not found: ID does not exist" Apr 20 20:15:11.717122 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716682 2578 scope.go:117] "RemoveContainer" containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" Apr 20 20:15:11.717122 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716844 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2"} err="failed to get container status \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": rpc error: code = NotFound desc = could not find container \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": container with ID starting with 06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2 not found: ID does not exist" Apr 20 20:15:11.717122 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.716857 2578 scope.go:117] "RemoveContainer" 
containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" Apr 20 20:15:11.717122 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.717064 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1"} err="failed to get container status \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": rpc error: code = NotFound desc = could not find container \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": container with ID starting with 3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1 not found: ID does not exist" Apr 20 20:15:11.717122 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.717079 2578 scope.go:117] "RemoveContainer" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" Apr 20 20:15:11.717371 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.717281 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9"} err="failed to get container status \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": rpc error: code = NotFound desc = could not find container \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": container with ID starting with 09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9 not found: ID does not exist" Apr 20 20:15:11.717371 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.717305 2578 scope.go:117] "RemoveContainer" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" Apr 20 20:15:11.717551 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.717535 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615"} err="failed to get container status \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": rpc error: code = NotFound desc = could not find container \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": container with ID starting with 050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615 not found: ID does not exist" Apr 20 20:15:11.717627 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.717554 2578 scope.go:117] "RemoveContainer" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" Apr 20 20:15:11.717802 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.717783 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3"} err="failed to get container status \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": rpc error: code = NotFound desc = could not find container \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": container with ID starting with 68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3 not found: ID does not exist" Apr 20 20:15:11.717861 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.717803 2578 scope.go:117] "RemoveContainer" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" Apr 20 20:15:11.718071 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.718048 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1"} err="failed to get container status \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": rpc error: code = NotFound desc = could not find container \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": container with ID starting with 85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1 not found: ID does not exist" Apr 20 20:15:11.718071 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.718071 2578 scope.go:117] "RemoveContainer" containerID="51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4" Apr 20 20:15:11.718300 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.718282 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4"} err="failed to get container status \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": rpc error: code = NotFound desc = could not find container \"51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4\": container with ID starting with 51af1d845c296ed15100a5f257855430c87d78e9961950ed54710ffb3b65fce4 not found: ID does not exist" Apr 20 20:15:11.718340 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.718302 2578 scope.go:117] "RemoveContainer" containerID="06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2" Apr 20 20:15:11.718568 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.718551 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2"} err="failed to get container status \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": rpc error: code = NotFound desc = could not find container \"06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2\": container with ID starting with 06f5dbf6e846ac95ace3e1630288a882f867627d182e4c5cc66bc7e0395f68c2 not found: ID does not exist" Apr 20 20:15:11.718633 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.718569 2578 scope.go:117] "RemoveContainer" containerID="3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1" Apr 20 20:15:11.718823 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.718805 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1"} err="failed to get container status \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": rpc error: code = NotFound desc = could not find container \"3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1\": container with ID starting with 3ba933648c9f846144cb2a5c8dd2e5154204821b390d335722ba7b81120abdf1 not found: ID does not exist" Apr 20 20:15:11.718894 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.718825 2578 scope.go:117] "RemoveContainer" containerID="09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9" Apr 20 20:15:11.719038 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.719018 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9"} err="failed to get container status \"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": rpc error: code = NotFound desc = could not find container 
\"09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9\": container with ID starting with 09c6301a8f001e224de7ee62ae71d6d73c2d682cf1205e445d726c70dc7d7af9 not found: ID does not exist" Apr 20 20:15:11.719080 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.719039 2578 scope.go:117] "RemoveContainer" containerID="050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615" Apr 20 20:15:11.719223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.719205 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615"} err="failed to get container status \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": rpc error: code = NotFound desc = could not find container \"050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615\": container with ID starting with 050694b34f9ca847606594eb3f743da9679b5240137502475b7d4916be831615 not found: ID does not exist" Apr 20 20:15:11.719288 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.719224 2578 scope.go:117] "RemoveContainer" containerID="68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3" Apr 20 20:15:11.719426 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.719408 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3"} err="failed to get container status \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": rpc error: code = NotFound desc = could not find container \"68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3\": container with ID starting with 68e2584b8293e27bca3475d97c9232bf2a7899b7cf93aef925a08db59b24a6b3 not found: ID does not exist" Apr 20 20:15:11.719471 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.719427 2578 scope.go:117] "RemoveContainer" containerID="85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1" Apr 20 20:15:11.719612 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.719592 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1"} err="failed to get container status \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": rpc error: code = NotFound desc = could not find container \"85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1\": container with ID starting with 85032f82a2510580fca984607e7cc8c8ee25d88d368f44d5d8b733f54bb879d1 not found: ID does not exist" Apr 20 20:15:11.721455 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.721440 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.725027 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.724976 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 20:15:11.725027 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.724989 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 20:15:11.725027 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725013 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 20:15:11.725315 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725079 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 20:15:11.725315 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725177 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 20:15:11.725406 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725373 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fc9zh\"" Apr 20 20:15:11.725487 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725468 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 20:15:11.725612 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 20:15:11.725686 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725616 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-295oct4p20ok3\"" Apr 20 20:15:11.725805 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725788 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 20:15:11.725869 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 20:15:11.725869 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725841 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 20:15:11.725869 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.725860 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 20:15:11.727927 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.727906 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 20:15:11.730572 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.730553 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 20:15:11.734252 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.734234 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:15:11.883364 ip-10-0-134-66 
kubenswrapper[2578]: I0420 20:15:11.883309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883337 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-web-config\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883353 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/26256eac-6ede-44cd-8112-507446876176-config-out\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883506 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883506 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883506 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883490 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883624 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883624 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883550 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/26256eac-6ede-44cd-8112-507446876176-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883624 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883601 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883624 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883744 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883744 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55pph\" (UniqueName: \"kubernetes.io/projected/26256eac-6ede-44cd-8112-507446876176-kube-api-access-55pph\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883744 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883744 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883744 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883726 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883882 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-config\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883882 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883773 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.883882 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.883795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/26256eac-6ede-44cd-8112-507446876176-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.889962 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.889941 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2286408f-eeca-4f7e-8808-a066d991c09e" path="/var/lib/kubelet/pods/2286408f-eeca-4f7e-8808-a066d991c09e/volumes" Apr 20 20:15:11.984792 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.984771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-config\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.984859 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.984807 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.984859 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.984825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/26256eac-6ede-44cd-8112-507446876176-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.984859 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.984846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.984987 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.984970 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-web-config\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985022 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/26256eac-6ede-44cd-8112-507446876176-config-out\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985065 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985191 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985163 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985299 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985299 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985299 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/26256eac-6ede-44cd-8112-507446876176-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985326 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55pph\" (UniqueName: \"kubernetes.io/projected/26256eac-6ede-44cd-8112-507446876176-kube-api-access-55pph\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985919 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.985919 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.985502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.986417 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.986388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.987013 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.987875 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-config\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.987881 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/26256eac-6ede-44cd-8112-507446876176-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.987976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-web-config\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.988069 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/26256eac-6ede-44cd-8112-507446876176-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.988219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.988236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/26256eac-6ede-44cd-8112-507446876176-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.988477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.988796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.989637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990028 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.989975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990567 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.990074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/26256eac-6ede-44cd-8112-507446876176-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990567 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.990472 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990700 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.990672 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990735 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.990685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.990864 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.990848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26256eac-6ede-44cd-8112-507446876176-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:11.995515 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:11.995498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55pph\" (UniqueName: \"kubernetes.io/projected/26256eac-6ede-44cd-8112-507446876176-kube-api-access-55pph\") pod \"prometheus-k8s-0\" (UID: \"26256eac-6ede-44cd-8112-507446876176\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:12.031590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:12.031547 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:12.152005 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:12.151930 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:15:12.155762 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:15:12.155732 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26256eac_6ede_44cd_8112_507446876176.slice/crio-88d2ef1a86f28d4c50c61c78f7be056c4c4efbde0b5fbaccafcafca7c5018833 WatchSource:0}: Error finding container 88d2ef1a86f28d4c50c61c78f7be056c4c4efbde0b5fbaccafcafca7c5018833: Status 404 returned error can't find the container with id 88d2ef1a86f28d4c50c61c78f7be056c4c4efbde0b5fbaccafcafca7c5018833 Apr 20 20:15:12.669623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:12.669592 2578 generic.go:358] "Generic (PLEG): container finished" podID="26256eac-6ede-44cd-8112-507446876176" containerID="5ebab1e0d2e73701430fe3282ad1716a0cb70cc54bb3dae83537ca5f30f46b95" exitCode=0 Apr 20 20:15:12.669949 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:12.669640 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26256eac-6ede-44cd-8112-507446876176","Type":"ContainerDied","Data":"5ebab1e0d2e73701430fe3282ad1716a0cb70cc54bb3dae83537ca5f30f46b95"} Apr 20 20:15:12.669949 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:12.669662 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26256eac-6ede-44cd-8112-507446876176","Type":"ContainerStarted","Data":"88d2ef1a86f28d4c50c61c78f7be056c4c4efbde0b5fbaccafcafca7c5018833"} Apr 20 20:15:13.676136 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:13.676101 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"26256eac-6ede-44cd-8112-507446876176","Type":"ContainerStarted","Data":"fc1e56dc2518f9d1448ef1369d6eb02f2c52d4a6efcd1a2a8a1418a617878daf"} Apr 20 20:15:13.676136 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:13.676137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26256eac-6ede-44cd-8112-507446876176","Type":"ContainerStarted","Data":"46b9912686cfe80b27246d9ad91fb66a2528ac6186468140344efb88b48c0a6e"} Apr 20 20:15:13.676526 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:13.676147 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26256eac-6ede-44cd-8112-507446876176","Type":"ContainerStarted","Data":"6dada8e0abf8822c9534f66a556b0ab8b13565d1419bce9b63f9e17a0c132f02"} Apr 20 20:15:13.676526 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:13.676156 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26256eac-6ede-44cd-8112-507446876176","Type":"ContainerStarted","Data":"905dc26d19ccc0eae21ba3813b887e1d6a465108b60e6e81b50fdd2fc64e1865"} Apr 20 20:15:13.676526 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:13.676165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26256eac-6ede-44cd-8112-507446876176","Type":"ContainerStarted","Data":"3f95174b11606fc0a22280f9c3cdea5623046ecf8255255ede48722eb9b34317"} Apr 20 20:15:13.676526 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:13.676174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"26256eac-6ede-44cd-8112-507446876176","Type":"ContainerStarted","Data":"ab0c8ee9a68d0c4a6f4f41707f44fd021a810adb15e658350ac524196d56413a"} Apr 20 20:15:13.706806 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:13.706754 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.7067374749999997 podStartE2EDuration="2.706737475s" podCreationTimestamp="2026-04-20 20:15:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:15:13.704621783 +0000 UTC m=+246.382888841" watchObservedRunningTime="2026-04-20 20:15:13.706737475 +0000 UTC m=+246.385004531" Apr 20 20:15:17.032071 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:17.032024 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:15:19.742818 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:19.742779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:15:19.745389 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:19.745358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d-metrics-certs\") pod \"network-metrics-daemon-zc8kt\" (UID: \"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d\") " pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:15:19.990920 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:19.990895 2578 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-779fg\"" Apr 20 20:15:19.998975 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:19.998927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zc8kt" Apr 20 20:15:20.135233 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:20.135203 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zc8kt"] Apr 20 20:15:20.138721 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:15:20.138695 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5778b2a_3cb2_40b9_a6ac_ba4bc47b3b7d.slice/crio-4f53edd04bbf14b5be8aaa055884997a6aad87d70e7cadd24e7ce8e5f7ac97f5 WatchSource:0}: Error finding container 4f53edd04bbf14b5be8aaa055884997a6aad87d70e7cadd24e7ce8e5f7ac97f5: Status 404 returned error can't find the container with id 4f53edd04bbf14b5be8aaa055884997a6aad87d70e7cadd24e7ce8e5f7ac97f5 Apr 20 20:15:20.698321 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:20.698284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zc8kt" event={"ID":"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d","Type":"ContainerStarted","Data":"4f53edd04bbf14b5be8aaa055884997a6aad87d70e7cadd24e7ce8e5f7ac97f5"} Apr 20 20:15:21.702556 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:21.702526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zc8kt" event={"ID":"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d","Type":"ContainerStarted","Data":"4ef58795b5cadb4d6fdefc2428abeb81a427ac88350e9b2e3b8148d47e023c7e"} Apr 20 20:15:21.702556 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:21.702561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zc8kt" event={"ID":"a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d","Type":"ContainerStarted","Data":"278edc44f3ae572bb803a2f317626fd7428249171ae3c23dccb95140aedef518"} Apr 20 20:15:21.720088 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:15:21.720005 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zc8kt" podStartSLOduration=252.796764604 podStartE2EDuration="4m13.719989516s" podCreationTimestamp="2026-04-20 20:11:08 +0000 UTC" firstStartedPulling="2026-04-20 20:15:20.140870956 +0000 UTC m=+252.819137991" lastFinishedPulling="2026-04-20 20:15:21.064095861 +0000 UTC m=+253.742362903" observedRunningTime="2026-04-20 20:15:21.719944009 +0000 UTC m=+254.398211068" watchObservedRunningTime="2026-04-20 20:15:21.719989516 +0000 UTC m=+254.398256573" Apr 20 20:16:07.761995 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:07.761963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:16:07.762467 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:07.762076 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:16:07.768251 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:07.768229 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 20:16:12.032704 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:12.032674 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:16:12.047962 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:12.047939 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:16:12.869116 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:12.869088 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:16:40.247774 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.247726 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-6mcfk"] Apr 20 20:16:40.251356 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.251327 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:40.253737 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.253701 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 20:16:40.253858 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.253768 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-ptwbj\"" Apr 20 20:16:40.254825 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.254803 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 20:16:40.256980 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.256955 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-6mcfk"] Apr 20 20:16:40.283777 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.283751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rtdv\" (UniqueName: \"kubernetes.io/projected/6346adff-2675-4d6a-8378-3a8a1c9820a5-kube-api-access-4rtdv\") pod \"cert-manager-webhook-587ccfb98-6mcfk\" (UID: \"6346adff-2675-4d6a-8378-3a8a1c9820a5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:40.283873 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.283810 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6346adff-2675-4d6a-8378-3a8a1c9820a5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-6mcfk\" (UID: \"6346adff-2675-4d6a-8378-3a8a1c9820a5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:40.385079 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.385054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rtdv\" (UniqueName: \"kubernetes.io/projected/6346adff-2675-4d6a-8378-3a8a1c9820a5-kube-api-access-4rtdv\") pod \"cert-manager-webhook-587ccfb98-6mcfk\" (UID: \"6346adff-2675-4d6a-8378-3a8a1c9820a5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:40.385232 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.385113 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6346adff-2675-4d6a-8378-3a8a1c9820a5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-6mcfk\" (UID: \"6346adff-2675-4d6a-8378-3a8a1c9820a5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:40.393151 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.393129 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4rtdv\" (UniqueName: \"kubernetes.io/projected/6346adff-2675-4d6a-8378-3a8a1c9820a5-kube-api-access-4rtdv\") pod \"cert-manager-webhook-587ccfb98-6mcfk\" (UID: \"6346adff-2675-4d6a-8378-3a8a1c9820a5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:40.393151 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.393146 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6346adff-2675-4d6a-8378-3a8a1c9820a5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-6mcfk\" (UID: \"6346adff-2675-4d6a-8378-3a8a1c9820a5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:40.577046 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.576992 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:40.693664 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.693633 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-6mcfk"] Apr 20 20:16:40.696534 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:16:40.696495 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6346adff_2675_4d6a_8378_3a8a1c9820a5.slice/crio-e86a2dae0772557b3119bcc6173f216861ed4cbde8ae18daa73c3a7f23cfb70e WatchSource:0}: Error finding container e86a2dae0772557b3119bcc6173f216861ed4cbde8ae18daa73c3a7f23cfb70e: Status 404 returned error can't find the container with id e86a2dae0772557b3119bcc6173f216861ed4cbde8ae18daa73c3a7f23cfb70e Apr 20 20:16:40.698560 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.698543 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:16:40.931451 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:40.931421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" event={"ID":"6346adff-2675-4d6a-8378-3a8a1c9820a5","Type":"ContainerStarted","Data":"e86a2dae0772557b3119bcc6173f216861ed4cbde8ae18daa73c3a7f23cfb70e"} Apr 20 20:16:43.942518 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:43.942483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" event={"ID":"6346adff-2675-4d6a-8378-3a8a1c9820a5","Type":"ContainerStarted","Data":"c5daab90dd86b405f6ef4cf73e462eb47bbda05f917354d29b9384436d09e9ba"} Apr 20 20:16:43.942925 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:43.942539 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:43.962529 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:43.962480 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" podStartSLOduration=1.022595193 podStartE2EDuration="3.962466677s" podCreationTimestamp="2026-04-20 20:16:40 +0000 UTC" firstStartedPulling="2026-04-20 20:16:40.698719508 +0000 UTC m=+333.376986545" lastFinishedPulling="2026-04-20 20:16:43.638590975 +0000 UTC m=+336.316858029" observedRunningTime="2026-04-20 20:16:43.961634446 +0000 UTC m=+336.639901497" watchObservedRunningTime="2026-04-20 20:16:43.962466677 +0000 UTC m=+336.640733734" Apr 20 20:16:46.642174 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.642125 2578 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj"] Apr 20 20:16:46.645527 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.645498 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" Apr 20 20:16:46.648199 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.648165 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 20:16:46.648328 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.648166 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:16:46.649390 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.649365 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-2brx9\"" Apr 20 20:16:46.651952 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.651922 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj"] Apr 20 20:16:46.734383 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.734358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0643c57f-304b-4062-90f3-7099c45b7e07-tmp\") pod \"openshift-lws-operator-bfc7f696d-vx4zj\" (UID: \"0643c57f-304b-4062-90f3-7099c45b7e07\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" Apr 20 20:16:46.734481 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.734402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6p2m\" (UniqueName: \"kubernetes.io/projected/0643c57f-304b-4062-90f3-7099c45b7e07-kube-api-access-l6p2m\") pod \"openshift-lws-operator-bfc7f696d-vx4zj\" (UID: \"0643c57f-304b-4062-90f3-7099c45b7e07\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" Apr 20 20:16:46.835586 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.835562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0643c57f-304b-4062-90f3-7099c45b7e07-tmp\") pod \"openshift-lws-operator-bfc7f696d-vx4zj\" (UID: \"0643c57f-304b-4062-90f3-7099c45b7e07\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" Apr 20 20:16:46.835687 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.835624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6p2m\" (UniqueName: \"kubernetes.io/projected/0643c57f-304b-4062-90f3-7099c45b7e07-kube-api-access-l6p2m\") pod \"openshift-lws-operator-bfc7f696d-vx4zj\" (UID: \"0643c57f-304b-4062-90f3-7099c45b7e07\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" Apr 20 20:16:46.835924 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.835905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0643c57f-304b-4062-90f3-7099c45b7e07-tmp\") pod \"openshift-lws-operator-bfc7f696d-vx4zj\" (UID: \"0643c57f-304b-4062-90f3-7099c45b7e07\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" Apr 20 20:16:46.844005 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.843975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6p2m\" 
(UniqueName: \"kubernetes.io/projected/0643c57f-304b-4062-90f3-7099c45b7e07-kube-api-access-l6p2m\") pod \"openshift-lws-operator-bfc7f696d-vx4zj\" (UID: \"0643c57f-304b-4062-90f3-7099c45b7e07\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" Apr 20 20:16:46.956905 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:46.956880 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" Apr 20 20:16:47.081088 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:47.081061 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj"] Apr 20 20:16:47.083203 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:16:47.083178 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0643c57f_304b_4062_90f3_7099c45b7e07.slice/crio-9ae83ae82fdb01faa1d6d6cbe457ff4c0fa8553955f9a0109f328ead05e7cb39 WatchSource:0}: Error finding container 9ae83ae82fdb01faa1d6d6cbe457ff4c0fa8553955f9a0109f328ead05e7cb39: Status 404 returned error can't find the container with id 9ae83ae82fdb01faa1d6d6cbe457ff4c0fa8553955f9a0109f328ead05e7cb39 Apr 20 20:16:47.958903 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:47.958852 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" event={"ID":"0643c57f-304b-4062-90f3-7099c45b7e07","Type":"ContainerStarted","Data":"9ae83ae82fdb01faa1d6d6cbe457ff4c0fa8553955f9a0109f328ead05e7cb39"} Apr 20 20:16:49.948153 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:49.948127 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-6mcfk" Apr 20 20:16:49.966698 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:49.966663 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" event={"ID":"0643c57f-304b-4062-90f3-7099c45b7e07","Type":"ContainerStarted","Data":"6d3086d7b2d1fe24f90d48b1a62b37e14ff40dd9dabcbdc235441806f67d77ab"} Apr 20 20:16:49.984849 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:16:49.984794 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vx4zj" podStartSLOduration=1.636074571 podStartE2EDuration="3.984777721s" podCreationTimestamp="2026-04-20 20:16:46 +0000 UTC" firstStartedPulling="2026-04-20 20:16:47.084727221 +0000 UTC m=+339.762994255" lastFinishedPulling="2026-04-20 20:16:49.433430368 +0000 UTC m=+342.111697405" observedRunningTime="2026-04-20 20:16:49.983490298 +0000 UTC m=+342.661757356" watchObservedRunningTime="2026-04-20 20:16:49.984777721 +0000 UTC m=+342.663044779" Apr 20 20:17:11.026142 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.026052 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn"] Apr 20 20:17:11.035754 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.035730 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.039903 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.039883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pq7n9\"" Apr 20 20:17:11.039903 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.039893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 20:17:11.040058 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.039893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 20:17:11.040190 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.040160 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 20:17:11.040190 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.040183 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 20:17:11.044681 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.044663 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn"] Apr 20 20:17:11.105652 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.105625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6be6a96a-5585-4d1d-b02b-8ab01176b90b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.105756 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.105680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtcmk\" (UniqueName: \"kubernetes.io/projected/6be6a96a-5585-4d1d-b02b-8ab01176b90b-kube-api-access-qtcmk\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.105756 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.105737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6be6a96a-5585-4d1d-b02b-8ab01176b90b-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.206530 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.206503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6be6a96a-5585-4d1d-b02b-8ab01176b90b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.206649 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.206549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtcmk\" (UniqueName: 
\"kubernetes.io/projected/6be6a96a-5585-4d1d-b02b-8ab01176b90b-kube-api-access-qtcmk\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.206649 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.206605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6be6a96a-5585-4d1d-b02b-8ab01176b90b-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.209282 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.209260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6be6a96a-5585-4d1d-b02b-8ab01176b90b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.209386 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.209265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6be6a96a-5585-4d1d-b02b-8ab01176b90b-webhook-cert\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.214805 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.214763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtcmk\" (UniqueName: \"kubernetes.io/projected/6be6a96a-5585-4d1d-b02b-8ab01176b90b-kube-api-access-qtcmk\") pod \"opendatahub-operator-controller-manager-7f7bf89c4-747pn\" (UID: \"6be6a96a-5585-4d1d-b02b-8ab01176b90b\") " pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.345907 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.345857 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:11.487568 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:11.487542 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn"] Apr 20 20:17:11.489069 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:17:11.489038 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be6a96a_5585_4d1d_b02b_8ab01176b90b.slice/crio-a483c56859a3a8f4754f2728f2bf9244da11312f08c6afa5ccdad6c9ed1def3a WatchSource:0}: Error finding container a483c56859a3a8f4754f2728f2bf9244da11312f08c6afa5ccdad6c9ed1def3a: Status 404 returned error can't find the container with id a483c56859a3a8f4754f2728f2bf9244da11312f08c6afa5ccdad6c9ed1def3a Apr 20 20:17:12.025350 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:12.025313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" event={"ID":"6be6a96a-5585-4d1d-b02b-8ab01176b90b","Type":"ContainerStarted","Data":"a483c56859a3a8f4754f2728f2bf9244da11312f08c6afa5ccdad6c9ed1def3a"} Apr 20 20:17:14.035033 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:14.035005 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" event={"ID":"6be6a96a-5585-4d1d-b02b-8ab01176b90b","Type":"ContainerStarted","Data":"3403d60c3b25eb4834a09bd413a4ea842206c0eeb4dcba4e5c41a28fe5eb293c"} Apr 20 20:17:14.035326 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:14.035085 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:14.057015 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:14.056972 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" podStartSLOduration=0.587702285 podStartE2EDuration="3.056958434s" podCreationTimestamp="2026-04-20 20:17:11 +0000 UTC" firstStartedPulling="2026-04-20 20:17:11.490958863 +0000 UTC m=+364.169225921" lastFinishedPulling="2026-04-20 20:17:13.960215028 +0000 UTC m=+366.638482070" observedRunningTime="2026-04-20 20:17:14.055668864 +0000 UTC m=+366.733935926" watchObservedRunningTime="2026-04-20 20:17:14.056958434 +0000 UTC m=+366.735225490" Apr 20 20:17:16.840799 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.840769 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t"] Apr 20 20:17:16.844262 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.844244 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:16.846834 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.846747 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 20:17:16.847869 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.847849 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 20:17:16.847960 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.847858 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-z5t9d\"" Apr 20 20:17:16.848022 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.847960 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 20:17:16.852910 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.852882 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t"] Apr 20 20:17:16.955123 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.955097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-manager-config\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:16.955240 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.955130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-cert\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:16.955240 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.955161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmblk\" (UniqueName: \"kubernetes.io/projected/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-kube-api-access-vmblk\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:16.955240 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:16.955221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-metrics-cert\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.056505 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.056477 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmblk\" (UniqueName: \"kubernetes.io/projected/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-kube-api-access-vmblk\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.056620 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.056514 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-metrics-cert\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.056620 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.056549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-manager-config\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.056620 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.056594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-cert\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.057106 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.057086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-manager-config\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.058958 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.058935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-cert\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.059047 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.058991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-metrics-cert\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.065615 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.065597 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmblk\" (UniqueName: \"kubernetes.io/projected/edabbbcf-9ce5-4f2a-a60a-5f45697c0006-kube-api-access-vmblk\") pod \"lws-controller-manager-54d459c768-k9r6t\" (UID: \"edabbbcf-9ce5-4f2a-a60a-5f45697c0006\") " pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.154621 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.154553 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:17.275560 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:17.275532 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t"] Apr 20 20:17:17.278305 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:17:17.278276 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedabbbcf_9ce5_4f2a_a60a_5f45697c0006.slice/crio-130faae0baa5d679e409f5fa130f1bb725473d8560fde74030b54b181ae18355 WatchSource:0}: Error finding container 130faae0baa5d679e409f5fa130f1bb725473d8560fde74030b54b181ae18355: Status 404 returned error can't find the container with id 130faae0baa5d679e409f5fa130f1bb725473d8560fde74030b54b181ae18355 Apr 20 20:17:18.050153 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:18.050117 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" event={"ID":"edabbbcf-9ce5-4f2a-a60a-5f45697c0006","Type":"ContainerStarted","Data":"130faae0baa5d679e409f5fa130f1bb725473d8560fde74030b54b181ae18355"} Apr 20 20:17:19.055051 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:19.054966 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" event={"ID":"edabbbcf-9ce5-4f2a-a60a-5f45697c0006","Type":"ContainerStarted","Data":"59f99d2420e8277b6755708832f096b874034521cb1323a7e59e6ff7c484bdbc"} Apr 20 20:17:19.055413 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:19.055093 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:19.072137 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:19.072084 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" podStartSLOduration=1.675209845 podStartE2EDuration="3.072072024s" podCreationTimestamp="2026-04-20 20:17:16 +0000 UTC" firstStartedPulling="2026-04-20 20:17:17.280198525 +0000 UTC m=+369.958465570" lastFinishedPulling="2026-04-20 20:17:18.677060701 +0000 UTC m=+371.355327749" observedRunningTime="2026-04-20 20:17:19.070923072 +0000 UTC m=+371.749190140" watchObservedRunningTime="2026-04-20 20:17:19.072072024 +0000 UTC m=+371.750339122" Apr 20 20:17:25.041176 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:25.041148 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7f7bf89c4-747pn" Apr 20 20:17:30.060382 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:30.060350 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-54d459c768-k9r6t" Apr 20 20:17:57.136083 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.136053 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh"] Apr 20 20:17:57.138666 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.138644 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.141223 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.141194 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 20:17:57.141364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.141265 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-dqhtw\"" Apr 20 20:17:57.141364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.141194 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 20:17:57.141364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.141292 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 20:17:57.150695 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.150636 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh"] Apr 20 20:17:57.237159 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.237260 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.237260 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237248 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.237363 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.237363 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-data\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.237544 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.237544 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.237544 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgpk\" (UniqueName: \"kubernetes.io/projected/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-kube-api-access-gsgpk\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.237668 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.237545 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338208 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338373 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338373 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: 
\"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338373 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338373 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338602 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338602 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338423 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338602 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgpk\" (UniqueName: \"kubernetes.io/projected/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-kube-api-access-gsgpk\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.338602 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.338488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.339051 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.339027 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.339168 
ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.339082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.339304 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.339280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.339393 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.339370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.339671 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.339646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.341120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.341090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.341478 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.341461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.350182 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.350154 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.350571 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.350553 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgpk\" (UniqueName: 
\"kubernetes.io/projected/f8bc649e-1d44-46a6-ac5a-562503a3a0e6-kube-api-access-gsgpk\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fd28nh\" (UID: \"f8bc649e-1d44-46a6-ac5a-562503a3a0e6\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.451719 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.451699 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:17:57.573385 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:57.573354 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh"] Apr 20 20:17:57.576201 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:17:57.576173 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8bc649e_1d44_46a6_ac5a_562503a3a0e6.slice/crio-cac2806fcc06ee18fd05aa3f64074b8c46f59866aeb417f804845b76bab10a35 WatchSource:0}: Error finding container cac2806fcc06ee18fd05aa3f64074b8c46f59866aeb417f804845b76bab10a35: Status 404 returned error can't find the container with id cac2806fcc06ee18fd05aa3f64074b8c46f59866aeb417f804845b76bab10a35 Apr 20 20:17:58.175142 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:17:58.175114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" event={"ID":"f8bc649e-1d44-46a6-ac5a-562503a3a0e6","Type":"ContainerStarted","Data":"cac2806fcc06ee18fd05aa3f64074b8c46f59866aeb417f804845b76bab10a35"} Apr 20 20:18:00.381519 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:00.381479 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 20:18:00.381837 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:00.381556 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 20:18:00.381837 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:00.381604 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 20:18:01.188592 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:01.188538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" event={"ID":"f8bc649e-1d44-46a6-ac5a-562503a3a0e6","Type":"ContainerStarted","Data":"c0c9499df3b2a088da290ad22d9f8754102126ac4de5d8335cbbaa6c69835134"} Apr 20 20:18:01.209054 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:01.209006 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" podStartSLOduration=1.405875508 podStartE2EDuration="4.20898025s" podCreationTimestamp="2026-04-20 20:17:57 +0000 UTC" firstStartedPulling="2026-04-20 20:17:57.57814366 +0000 UTC m=+410.256410696" lastFinishedPulling="2026-04-20 20:18:00.3812484 +0000 UTC m=+413.059515438" observedRunningTime="2026-04-20 20:18:01.207437076 +0000 UTC m=+413.885704153" watchObservedRunningTime="2026-04-20 20:18:01.20898025 +0000 UTC m=+413.887247306" Apr 20 
20:18:01.452982 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:01.452911 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:18:01.457468 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:01.457445 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:18:02.192286 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:02.192249 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:18:02.193333 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:02.193312 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fd28nh" Apr 20 20:18:28.833122 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:28.833046 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-45vb7"] Apr 20 20:18:28.836188 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:28.836170 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" Apr 20 20:18:28.838914 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:28.838895 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 20:18:28.839779 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:28.839758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-lkfnx\"" Apr 20 20:18:28.839902 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:28.839761 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 20:18:28.842548 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:28.842524 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-45vb7"] Apr 20 20:18:28.959776 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:28.959742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn9bp\" (UniqueName: \"kubernetes.io/projected/004a0099-5ae9-4d17-a642-c54df7d35e98-kube-api-access-cn9bp\") pod \"kuadrant-operator-catalog-45vb7\" (UID: \"004a0099-5ae9-4d17-a642-c54df7d35e98\") " pod="kuadrant-system/kuadrant-operator-catalog-45vb7" Apr 20 20:18:29.060288 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.060262 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn9bp\" (UniqueName: \"kubernetes.io/projected/004a0099-5ae9-4d17-a642-c54df7d35e98-kube-api-access-cn9bp\") pod \"kuadrant-operator-catalog-45vb7\" (UID: \"004a0099-5ae9-4d17-a642-c54df7d35e98\") " pod="kuadrant-system/kuadrant-operator-catalog-45vb7" Apr 20 20:18:29.067863 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.067836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn9bp\" (UniqueName: \"kubernetes.io/projected/004a0099-5ae9-4d17-a642-c54df7d35e98-kube-api-access-cn9bp\") pod \"kuadrant-operator-catalog-45vb7\" (UID: \"004a0099-5ae9-4d17-a642-c54df7d35e98\") " pod="kuadrant-system/kuadrant-operator-catalog-45vb7" Apr 20 20:18:29.146765 ip-10-0-134-66 
kubenswrapper[2578]: I0420 20:18:29.146710 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" Apr 20 20:18:29.199430 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.199390 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-45vb7"] Apr 20 20:18:29.264960 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.264937 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-45vb7"] Apr 20 20:18:29.267650 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:18:29.267619 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod004a0099_5ae9_4d17_a642_c54df7d35e98.slice/crio-b8c6e667586cdc77f19e206cf596112a3652bea7db0e57a5dc4bc9643ea6998f WatchSource:0}: Error finding container b8c6e667586cdc77f19e206cf596112a3652bea7db0e57a5dc4bc9643ea6998f: Status 404 returned error can't find the container with id b8c6e667586cdc77f19e206cf596112a3652bea7db0e57a5dc4bc9643ea6998f Apr 20 20:18:29.285396 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.285371 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" event={"ID":"004a0099-5ae9-4d17-a642-c54df7d35e98","Type":"ContainerStarted","Data":"b8c6e667586cdc77f19e206cf596112a3652bea7db0e57a5dc4bc9643ea6998f"} Apr 20 20:18:29.408525 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.408497 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-64ngl"] Apr 20 20:18:29.412699 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.412684 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:29.418011 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.417990 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-64ngl"] Apr 20 20:18:29.463321 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.463301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4wnc\" (UniqueName: \"kubernetes.io/projected/599f74c2-9421-424b-9b12-072566336497-kube-api-access-m4wnc\") pod \"kuadrant-operator-catalog-64ngl\" (UID: \"599f74c2-9421-424b-9b12-072566336497\") " pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:29.564049 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.564024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4wnc\" (UniqueName: \"kubernetes.io/projected/599f74c2-9421-424b-9b12-072566336497-kube-api-access-m4wnc\") pod \"kuadrant-operator-catalog-64ngl\" (UID: \"599f74c2-9421-424b-9b12-072566336497\") " pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:29.572095 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.572075 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4wnc\" (UniqueName: \"kubernetes.io/projected/599f74c2-9421-424b-9b12-072566336497-kube-api-access-m4wnc\") pod \"kuadrant-operator-catalog-64ngl\" (UID: \"599f74c2-9421-424b-9b12-072566336497\") " pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:29.723194 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.723110 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:29.893470 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:29.893446 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-64ngl"] Apr 20 20:18:29.912389 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:18:29.912366 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod599f74c2_9421_424b_9b12_072566336497.slice/crio-1ca98545fe7ff023fa232e8cc931b7e0d1eb08739dd00d3c9189f15268e0e0b0 WatchSource:0}: Error finding container 1ca98545fe7ff023fa232e8cc931b7e0d1eb08739dd00d3c9189f15268e0e0b0: Status 404 returned error can't find the container with id 1ca98545fe7ff023fa232e8cc931b7e0d1eb08739dd00d3c9189f15268e0e0b0 Apr 20 20:18:30.289562 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:30.289532 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" event={"ID":"599f74c2-9421-424b-9b12-072566336497","Type":"ContainerStarted","Data":"1ca98545fe7ff023fa232e8cc931b7e0d1eb08739dd00d3c9189f15268e0e0b0"} Apr 20 20:18:32.298791 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.298752 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" event={"ID":"004a0099-5ae9-4d17-a642-c54df7d35e98","Type":"ContainerStarted","Data":"93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2"} Apr 20 20:18:32.299329 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.298843 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" podUID="004a0099-5ae9-4d17-a642-c54df7d35e98" containerName="registry-server" containerID="cri-o://93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2" gracePeriod=2 Apr 20 20:18:32.300347 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.300325 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" event={"ID":"599f74c2-9421-424b-9b12-072566336497","Type":"ContainerStarted","Data":"1ba9f5b8382fcef09d959ca5e40f7e17aab6b47f0dbaf561227f1afe106660b8"} Apr 20 20:18:32.313957 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.313870 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" podStartSLOduration=2.253558337 podStartE2EDuration="4.313859018s" podCreationTimestamp="2026-04-20 20:18:28 +0000 UTC" firstStartedPulling="2026-04-20 20:18:29.268797353 +0000 UTC m=+441.947064388" lastFinishedPulling="2026-04-20 20:18:31.329098032 +0000 UTC m=+444.007365069" observedRunningTime="2026-04-20 20:18:32.312111192 +0000 UTC m=+444.990378248" watchObservedRunningTime="2026-04-20 20:18:32.313859018 +0000 UTC m=+444.992126087" Apr 20 20:18:32.326639 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.326602 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" podStartSLOduration=1.892618679 podStartE2EDuration="3.326593957s" podCreationTimestamp="2026-04-20 20:18:29 +0000 UTC" firstStartedPulling="2026-04-20 20:18:29.913743304 +0000 UTC m=+442.592010339" lastFinishedPulling="2026-04-20 20:18:31.347718579 +0000 UTC m=+444.025985617" observedRunningTime="2026-04-20 20:18:32.324805669 +0000 UTC m=+445.003072726" watchObservedRunningTime="2026-04-20 20:18:32.326593957 +0000 UTC m=+445.004861005" Apr 20 
20:18:32.529847 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.529826 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" Apr 20 20:18:32.586682 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.586619 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn9bp\" (UniqueName: \"kubernetes.io/projected/004a0099-5ae9-4d17-a642-c54df7d35e98-kube-api-access-cn9bp\") pod \"004a0099-5ae9-4d17-a642-c54df7d35e98\" (UID: \"004a0099-5ae9-4d17-a642-c54df7d35e98\") " Apr 20 20:18:32.588852 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.588825 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/004a0099-5ae9-4d17-a642-c54df7d35e98-kube-api-access-cn9bp" (OuterVolumeSpecName: "kube-api-access-cn9bp") pod "004a0099-5ae9-4d17-a642-c54df7d35e98" (UID: "004a0099-5ae9-4d17-a642-c54df7d35e98"). InnerVolumeSpecName "kube-api-access-cn9bp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:18:32.687614 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:32.687571 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cn9bp\" (UniqueName: \"kubernetes.io/projected/004a0099-5ae9-4d17-a642-c54df7d35e98-kube-api-access-cn9bp\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:18:33.304886 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.304847 2578 generic.go:358] "Generic (PLEG): container finished" podID="004a0099-5ae9-4d17-a642-c54df7d35e98" containerID="93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2" exitCode=0 Apr 20 20:18:33.305283 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.304906 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" Apr 20 20:18:33.305283 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.304905 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" event={"ID":"004a0099-5ae9-4d17-a642-c54df7d35e98","Type":"ContainerDied","Data":"93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2"} Apr 20 20:18:33.305283 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.304950 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-45vb7" event={"ID":"004a0099-5ae9-4d17-a642-c54df7d35e98","Type":"ContainerDied","Data":"b8c6e667586cdc77f19e206cf596112a3652bea7db0e57a5dc4bc9643ea6998f"} Apr 20 20:18:33.305283 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.304971 2578 scope.go:117] "RemoveContainer" containerID="93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2" Apr 20 20:18:33.313823 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.313797 2578 scope.go:117] "RemoveContainer" containerID="93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2" Apr 20 20:18:33.314152 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:18:33.314125 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2\": container with ID starting with 93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2 not found: ID does not exist" containerID="93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2" Apr 20 20:18:33.314218 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.314151 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2"} err="failed to get container status \"93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2\": rpc error: code = NotFound desc = could not find container \"93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2\": container with ID starting with 93f04dba38694644fb3229fa4f7ba33280f7093bc24795a2017edae9c58cb8a2 not found: ID does not exist" Apr 20 20:18:33.324424 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.324402 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-45vb7"] Apr 20 20:18:33.327450 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.327430 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-45vb7"] Apr 20 20:18:33.891239 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:33.891207 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="004a0099-5ae9-4d17-a642-c54df7d35e98" path="/var/lib/kubelet/pods/004a0099-5ae9-4d17-a642-c54df7d35e98/volumes" Apr 20 20:18:39.724058 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:39.724013 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:39.724442 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:39.724105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:39.746838 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:39.746815 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:40.381251 
ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:40.381222 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-64ngl" Apr 20 20:18:59.785994 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.785965 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft"] Apr 20 20:18:59.786362 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.786278 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="004a0099-5ae9-4d17-a642-c54df7d35e98" containerName="registry-server" Apr 20 20:18:59.786362 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.786288 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="004a0099-5ae9-4d17-a642-c54df7d35e98" containerName="registry-server" Apr 20 20:18:59.786362 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.786356 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="004a0099-5ae9-4d17-a642-c54df7d35e98" containerName="registry-server" Apr 20 20:18:59.788192 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.788177 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:18:59.790713 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.790692 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-p49ld\"" Apr 20 20:18:59.801056 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.801033 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft"] Apr 20 20:18:59.884810 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.884788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btwrj\" (UniqueName: \"kubernetes.io/projected/20b5a918-5d41-4237-9898-3217e6f8c595-kube-api-access-btwrj\") pod \"limitador-operator-controller-manager-85c4996f8c-5dpft\" (UID: \"20b5a918-5d41-4237-9898-3217e6f8c595\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:18:59.985750 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:18:59.985723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btwrj\" (UniqueName: \"kubernetes.io/projected/20b5a918-5d41-4237-9898-3217e6f8c595-kube-api-access-btwrj\") pod \"limitador-operator-controller-manager-85c4996f8c-5dpft\" (UID: \"20b5a918-5d41-4237-9898-3217e6f8c595\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:19:00.001390 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:00.001368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwrj\" (UniqueName: \"kubernetes.io/projected/20b5a918-5d41-4237-9898-3217e6f8c595-kube-api-access-btwrj\") pod \"limitador-operator-controller-manager-85c4996f8c-5dpft\" (UID: \"20b5a918-5d41-4237-9898-3217e6f8c595\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:19:00.098215 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:00.098162 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:19:00.218947 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:00.218916 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft"] Apr 20 20:19:00.221442 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:19:00.221416 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20b5a918_5d41_4237_9898_3217e6f8c595.slice/crio-8ecd70c40eab14e72f7731522fe303282cb23171d3705456221581a2cbb32146 WatchSource:0}: Error finding container 8ecd70c40eab14e72f7731522fe303282cb23171d3705456221581a2cbb32146: Status 404 returned error can't find the container with id 8ecd70c40eab14e72f7731522fe303282cb23171d3705456221581a2cbb32146 Apr 20 20:19:00.402787 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:00.402728 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" event={"ID":"20b5a918-5d41-4237-9898-3217e6f8c595","Type":"ContainerStarted","Data":"8ecd70c40eab14e72f7731522fe303282cb23171d3705456221581a2cbb32146"} Apr 20 20:19:02.411077 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:02.411050 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" event={"ID":"20b5a918-5d41-4237-9898-3217e6f8c595","Type":"ContainerStarted","Data":"25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2"} Apr 20 20:19:02.411395 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:02.411150 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:19:02.430558 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:02.430515 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" podStartSLOduration=1.409115349 podStartE2EDuration="3.430501823s" podCreationTimestamp="2026-04-20 20:18:59 +0000 UTC" firstStartedPulling="2026-04-20 20:19:00.223273201 +0000 UTC m=+472.901540237" lastFinishedPulling="2026-04-20 20:19:02.244659674 +0000 UTC m=+474.922926711" observedRunningTime="2026-04-20 20:19:02.429601488 +0000 UTC m=+475.107868543" watchObservedRunningTime="2026-04-20 20:19:02.430501823 +0000 UTC m=+475.108768871" Apr 20 20:19:07.528271 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.528241 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-8rj49"] Apr 20 20:19:07.532081 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.532059 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-8rj49" Apr 20 20:19:07.534533 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.534508 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-4zwq7\"" Apr 20 20:19:07.541528 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.541506 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-8rj49"] Apr 20 20:19:07.645478 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.645454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dhgl\" (UniqueName: \"kubernetes.io/projected/67c536d1-256d-465b-8dbb-23442620b388-kube-api-access-6dhgl\") pod \"authorino-operator-657f44b778-8rj49\" (UID: \"67c536d1-256d-465b-8dbb-23442620b388\") " pod="kuadrant-system/authorino-operator-657f44b778-8rj49" Apr 20 20:19:07.746292 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.746269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dhgl\" (UniqueName: \"kubernetes.io/projected/67c536d1-256d-465b-8dbb-23442620b388-kube-api-access-6dhgl\") pod \"authorino-operator-657f44b778-8rj49\" (UID: \"67c536d1-256d-465b-8dbb-23442620b388\") " pod="kuadrant-system/authorino-operator-657f44b778-8rj49" Apr 20 20:19:07.754710 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.754691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dhgl\" (UniqueName: \"kubernetes.io/projected/67c536d1-256d-465b-8dbb-23442620b388-kube-api-access-6dhgl\") pod \"authorino-operator-657f44b778-8rj49\" (UID: \"67c536d1-256d-465b-8dbb-23442620b388\") " pod="kuadrant-system/authorino-operator-657f44b778-8rj49" Apr 20 20:19:07.845815 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.845786 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-4zwq7\"" Apr 20 20:19:07.853823 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.853807 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-8rj49" Apr 20 20:19:07.973707 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:07.973689 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-8rj49"] Apr 20 20:19:07.975912 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:19:07.975881 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c536d1_256d_465b_8dbb_23442620b388.slice/crio-60f01ac6c5e39f048247c373a14fdc8bf3284003bda6e48595eb3ed800383f47 WatchSource:0}: Error finding container 60f01ac6c5e39f048247c373a14fdc8bf3284003bda6e48595eb3ed800383f47: Status 404 returned error can't find the container with id 60f01ac6c5e39f048247c373a14fdc8bf3284003bda6e48595eb3ed800383f47 Apr 20 20:19:08.434120 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:08.434092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-8rj49" event={"ID":"67c536d1-256d-465b-8dbb-23442620b388","Type":"ContainerStarted","Data":"60f01ac6c5e39f048247c373a14fdc8bf3284003bda6e48595eb3ed800383f47"} Apr 20 20:19:10.443442 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:10.443413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-8rj49" event={"ID":"67c536d1-256d-465b-8dbb-23442620b388","Type":"ContainerStarted","Data":"e923fd55a510df344c4f476de6b104c88df24a342d8b0601b90d7a88e8814e8b"} Apr 20 20:19:10.443798 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:10.443596 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-8rj49" Apr 20 20:19:10.459661 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:10.459622 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-8rj49" podStartSLOduration=1.308269149 podStartE2EDuration="3.459610519s" podCreationTimestamp="2026-04-20 20:19:07 +0000 UTC" firstStartedPulling="2026-04-20 20:19:07.977942113 +0000 UTC m=+480.656209148" lastFinishedPulling="2026-04-20 20:19:10.129283481 +0000 UTC m=+482.807550518" observedRunningTime="2026-04-20 20:19:10.457986387 +0000 UTC m=+483.136253444" watchObservedRunningTime="2026-04-20 20:19:10.459610519 +0000 UTC m=+483.137877573" Apr 20 20:19:13.417600 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:13.417537 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:19:21.450147 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:21.450119 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-8rj49" Apr 20 20:19:25.242864 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.242831 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft"] Apr 20 20:19:25.243294 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.243148 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" containerName="manager" containerID="cri-o://25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2" gracePeriod=2 Apr 20 20:19:25.249938 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.249910 2578 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft"] Apr 20 20:19:25.274256 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.274222 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b"] Apr 20 20:19:25.274616 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.274601 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" containerName="manager" Apr 20 20:19:25.274616 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.274618 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" containerName="manager" Apr 20 20:19:25.274723 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.274684 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" containerName="manager" Apr 20 20:19:25.276678 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.276661 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" Apr 20 20:19:25.279042 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.279014 2578 status_manager.go:895] "Failed to get status for pod" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" err="pods \"limitador-operator-controller-manager-85c4996f8c-5dpft\" is forbidden: User \"system:node:ip-10-0-134-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-66.ec2.internal' and this object" Apr 20 20:19:25.287544 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.287519 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b"] Apr 20 20:19:25.382790 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.382756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxsfh\" (UniqueName: \"kubernetes.io/projected/b2734f44-c040-42b8-bd74-c8526ca495cc-kube-api-access-nxsfh\") pod \"limitador-operator-controller-manager-85c4996f8c-zzg4b\" (UID: \"b2734f44-c040-42b8-bd74-c8526ca495cc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" Apr 20 20:19:25.476707 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.476687 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:19:25.479458 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.479429 2578 status_manager.go:895] "Failed to get status for pod" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" err="pods \"limitador-operator-controller-manager-85c4996f8c-5dpft\" is forbidden: User \"system:node:ip-10-0-134-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-66.ec2.internal' and this object" Apr 20 20:19:25.483809 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.483787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxsfh\" (UniqueName: \"kubernetes.io/projected/b2734f44-c040-42b8-bd74-c8526ca495cc-kube-api-access-nxsfh\") pod \"limitador-operator-controller-manager-85c4996f8c-zzg4b\" (UID: \"b2734f44-c040-42b8-bd74-c8526ca495cc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" Apr 20 20:19:25.498489 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.498436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxsfh\" (UniqueName: \"kubernetes.io/projected/b2734f44-c040-42b8-bd74-c8526ca495cc-kube-api-access-nxsfh\") pod \"limitador-operator-controller-manager-85c4996f8c-zzg4b\" (UID: \"b2734f44-c040-42b8-bd74-c8526ca495cc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" Apr 20 20:19:25.499570 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.499547 2578 generic.go:358] "Generic (PLEG): container finished" podID="20b5a918-5d41-4237-9898-3217e6f8c595" containerID="25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2" exitCode=0 Apr 20 20:19:25.499683 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.499622 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" Apr 20 20:19:25.499683 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.499636 2578 scope.go:117] "RemoveContainer" containerID="25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2" Apr 20 20:19:25.502710 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.502686 2578 status_manager.go:895] "Failed to get status for pod" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" err="pods \"limitador-operator-controller-manager-85c4996f8c-5dpft\" is forbidden: User \"system:node:ip-10-0-134-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-66.ec2.internal' and this object" Apr 20 20:19:25.508168 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.508151 2578 scope.go:117] "RemoveContainer" containerID="25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2" Apr 20 20:19:25.508410 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:19:25.508382 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2\": container with ID starting with 25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2 not found: ID does not exist" containerID="25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2" Apr 20 20:19:25.508466 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.508412 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2"} err="failed to get container status \"25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2\": rpc error: code = NotFound desc = could not find container \"25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2\": container with ID starting with 25fdccaac3a7c866f2b16062988c3ff7735fa67104c2ecb6f155215a898b9ba2 not found: ID does not exist" Apr 20 20:19:25.584754 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.584733 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btwrj\" (UniqueName: \"kubernetes.io/projected/20b5a918-5d41-4237-9898-3217e6f8c595-kube-api-access-btwrj\") pod \"20b5a918-5d41-4237-9898-3217e6f8c595\" (UID: \"20b5a918-5d41-4237-9898-3217e6f8c595\") " Apr 20 20:19:25.586661 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.586631 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b5a918-5d41-4237-9898-3217e6f8c595-kube-api-access-btwrj" (OuterVolumeSpecName: "kube-api-access-btwrj") pod "20b5a918-5d41-4237-9898-3217e6f8c595" (UID: "20b5a918-5d41-4237-9898-3217e6f8c595"). InnerVolumeSpecName "kube-api-access-btwrj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:19:25.635761 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.635740 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" Apr 20 20:19:25.685820 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.685776 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-btwrj\" (UniqueName: \"kubernetes.io/projected/20b5a918-5d41-4237-9898-3217e6f8c595-kube-api-access-btwrj\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:19:25.757778 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.757753 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b"] Apr 20 20:19:25.760107 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:19:25.760073 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2734f44_c040_42b8_bd74_c8526ca495cc.slice/crio-b4f4db3f616d416e573ceef74ec970e5db61bd1aaa23cc194b8b470c08428b39 WatchSource:0}: Error finding container b4f4db3f616d416e573ceef74ec970e5db61bd1aaa23cc194b8b470c08428b39: Status 404 returned error can't find the container with id b4f4db3f616d416e573ceef74ec970e5db61bd1aaa23cc194b8b470c08428b39 Apr 20 20:19:25.809436 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.809411 2578 status_manager.go:895] "Failed to get status for pod" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5dpft" err="pods \"limitador-operator-controller-manager-85c4996f8c-5dpft\" is forbidden: User \"system:node:ip-10-0-134-66.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-66.ec2.internal' and this object" Apr 20 20:19:25.890514 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:25.890489 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b5a918-5d41-4237-9898-3217e6f8c595" path="/var/lib/kubelet/pods/20b5a918-5d41-4237-9898-3217e6f8c595/volumes" Apr 20 20:19:26.505423 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:26.505389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" event={"ID":"b2734f44-c040-42b8-bd74-c8526ca495cc","Type":"ContainerStarted","Data":"f57bf48005bf091edfc8694e4753ceda57b1ca5d5dd7d98216f41b06be563f90"} Apr 20 20:19:26.505423 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:26.505425 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" event={"ID":"b2734f44-c040-42b8-bd74-c8526ca495cc","Type":"ContainerStarted","Data":"b4f4db3f616d416e573ceef74ec970e5db61bd1aaa23cc194b8b470c08428b39"} Apr 20 20:19:26.505937 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:26.505518 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" Apr 20 20:19:26.522886 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:26.522840 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" podStartSLOduration=1.5228242939999999 podStartE2EDuration="1.522824294s" podCreationTimestamp="2026-04-20 20:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:19:26.52047711 +0000 UTC m=+499.198744169" 
watchObservedRunningTime="2026-04-20 20:19:26.522824294 +0000 UTC m=+499.201091350" Apr 20 20:19:37.511279 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:19:37.511241 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zzg4b" Apr 20 20:20:10.528710 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.528678 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:20:10.531756 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.531736 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:10.534235 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.534212 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 20:20:10.534321 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.534253 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9hfnq\"" Apr 20 20:20:10.537330 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.537310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:20:10.563812 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.563790 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:20:10.597461 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.597434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546wn\" (UniqueName: \"kubernetes.io/projected/5e661ad4-502f-4dd7-adec-3dea8dd4f72e-kube-api-access-546wn\") pod \"limitador-limitador-78c99df468-p5xxh\" (UID: \"5e661ad4-502f-4dd7-adec-3dea8dd4f72e\") " pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:10.597608 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.597499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5e661ad4-502f-4dd7-adec-3dea8dd4f72e-config-file\") pod \"limitador-limitador-78c99df468-p5xxh\" (UID: \"5e661ad4-502f-4dd7-adec-3dea8dd4f72e\") " pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:10.698154 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.698131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5e661ad4-502f-4dd7-adec-3dea8dd4f72e-config-file\") pod \"limitador-limitador-78c99df468-p5xxh\" (UID: \"5e661ad4-502f-4dd7-adec-3dea8dd4f72e\") " pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:10.698310 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.698194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-546wn\" (UniqueName: \"kubernetes.io/projected/5e661ad4-502f-4dd7-adec-3dea8dd4f72e-kube-api-access-546wn\") pod \"limitador-limitador-78c99df468-p5xxh\" (UID: \"5e661ad4-502f-4dd7-adec-3dea8dd4f72e\") " pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:10.698774 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.698752 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/5e661ad4-502f-4dd7-adec-3dea8dd4f72e-config-file\") pod \"limitador-limitador-78c99df468-p5xxh\" (UID: \"5e661ad4-502f-4dd7-adec-3dea8dd4f72e\") " pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:10.705820 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.705789 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-546wn\" (UniqueName: \"kubernetes.io/projected/5e661ad4-502f-4dd7-adec-3dea8dd4f72e-kube-api-access-546wn\") pod \"limitador-limitador-78c99df468-p5xxh\" (UID: \"5e661ad4-502f-4dd7-adec-3dea8dd4f72e\") " pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:10.843071 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.843015 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:10.964570 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:10.964546 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:20:10.966407 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:20:10.966373 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e661ad4_502f_4dd7_adec_3dea8dd4f72e.slice/crio-ce05cf1d330e21058655440f9a714a2b7cb800b481f87ad6c316c22ea52188d7 WatchSource:0}: Error finding container ce05cf1d330e21058655440f9a714a2b7cb800b481f87ad6c316c22ea52188d7: Status 404 returned error can't find the container with id ce05cf1d330e21058655440f9a714a2b7cb800b481f87ad6c316c22ea52188d7 Apr 20 20:20:11.660767 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:11.660710 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" event={"ID":"5e661ad4-502f-4dd7-adec-3dea8dd4f72e","Type":"ContainerStarted","Data":"ce05cf1d330e21058655440f9a714a2b7cb800b481f87ad6c316c22ea52188d7"} Apr 20 20:20:13.668415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:13.668389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" event={"ID":"5e661ad4-502f-4dd7-adec-3dea8dd4f72e","Type":"ContainerStarted","Data":"3cf0449fefba41a6a4e2949562388f8fc0ad3763012e7cadc0b1e04e6f6ec0c1"} Apr 20 20:20:13.668778 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:13.668527 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:20:13.688052 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:13.687992 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" podStartSLOduration=1.152954306 podStartE2EDuration="3.68798018s" podCreationTimestamp="2026-04-20 20:20:10 +0000 UTC" firstStartedPulling="2026-04-20 20:20:10.968776409 +0000 UTC m=+543.647043444" lastFinishedPulling="2026-04-20 20:20:13.503802272 +0000 UTC m=+546.182069318" observedRunningTime="2026-04-20 20:20:13.687874339 +0000 UTC m=+546.366141396" watchObservedRunningTime="2026-04-20 20:20:13.68798018 +0000 UTC m=+546.366247237" Apr 20 20:20:24.673446 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:20:24.673416 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-p5xxh" Apr 20 20:21:07.370807 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:21:07.370769 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:21:07.785873 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:21:07.785847 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:21:07.787901 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:21:07.787878 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:21:32.461131 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:21:32.460923 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:21:36.257369 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:21:36.257335 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:21:50.456052 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:21:50.456022 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:22:02.448554 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:22:02.448524 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:22:07.055152 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:22:07.055113 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:22:59.964640 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:22:59.964545 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:23:53.456791 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:23:53.456752 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:24:04.652092 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:24:04.652056 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:24:13.050909 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:24:13.050879 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:24:22.953027 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:24:22.952992 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:24:31.952705 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:24:31.952635 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:24:42.563245 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:24:42.563205 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:25:43.357177 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:25:43.357142 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:25:59.357131 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:25:59.357039 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:26:07.812457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:26:07.812423 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" 
Apr 20 20:26:07.815276 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:26:07.815249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log"
Apr 20 20:26:37.913980 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:26:37.913948 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:26:54.057171 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:26:54.057132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:27:09.594456 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:27:09.594422 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:27:25.851810 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:27:25.851737 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:27:29.761659 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:27:29.761623 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:27:53.353982 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:27:53.353945 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:27:57.653893 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:27:57.653860 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:28:20.450278 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:28:20.450231 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:28:29.466848 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:28:29.466813 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:28:46.050383 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:28:46.050347 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:28:53.857057 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:28:53.857018 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:29:11.260442 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:29:11.260365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:29:18.755785 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:29:18.755753 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:29:52.155596 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:29:52.155544 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:30:00.143804 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.143771 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611950-v754q"]
Apr 20 20:30:00.146979 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.146962 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611950-v754q"
Apr 20 20:30:00.149958 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.149939 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qp9nz\""
Apr 20 20:30:00.166624 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.166602 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611950-v754q"]
Apr 20 20:30:00.275869 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.275836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t9rt\" (UniqueName: \"kubernetes.io/projected/d1d1b1a1-feb7-437f-96a2-a465548c60d7-kube-api-access-5t9rt\") pod \"maas-api-key-cleanup-29611950-v754q\" (UID: \"d1d1b1a1-feb7-437f-96a2-a465548c60d7\") " pod="opendatahub/maas-api-key-cleanup-29611950-v754q"
Apr 20 20:30:00.376639 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.376605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t9rt\" (UniqueName: \"kubernetes.io/projected/d1d1b1a1-feb7-437f-96a2-a465548c60d7-kube-api-access-5t9rt\") pod \"maas-api-key-cleanup-29611950-v754q\" (UID: \"d1d1b1a1-feb7-437f-96a2-a465548c60d7\") " pod="opendatahub/maas-api-key-cleanup-29611950-v754q"
Apr 20 20:30:00.385506 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.385483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t9rt\" (UniqueName: \"kubernetes.io/projected/d1d1b1a1-feb7-437f-96a2-a465548c60d7-kube-api-access-5t9rt\") pod \"maas-api-key-cleanup-29611950-v754q\" (UID: \"d1d1b1a1-feb7-437f-96a2-a465548c60d7\") " pod="opendatahub/maas-api-key-cleanup-29611950-v754q"
Apr 20 20:30:00.451957 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.451921 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:30:00.457220 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.457201 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611950-v754q"
Apr 20 20:30:00.582112 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.582081 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611950-v754q"]
Apr 20 20:30:00.585542 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:30:00.585511 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d1b1a1_feb7_437f_96a2_a465548c60d7.slice/crio-0906e8955d9a5995216c8b14c81862e71503286163bbedb1a27ede329dd69064 WatchSource:0}: Error finding container 0906e8955d9a5995216c8b14c81862e71503286163bbedb1a27ede329dd69064: Status 404 returned error can't find the container with id 0906e8955d9a5995216c8b14c81862e71503286163bbedb1a27ede329dd69064
Apr 20 20:30:00.587856 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.587837 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:30:00.631390 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:00.631356 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" event={"ID":"d1d1b1a1-feb7-437f-96a2-a465548c60d7","Type":"ContainerStarted","Data":"0906e8955d9a5995216c8b14c81862e71503286163bbedb1a27ede329dd69064"}
Apr 20 20:30:03.643111 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:03.643075 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" event={"ID":"d1d1b1a1-feb7-437f-96a2-a465548c60d7","Type":"ContainerStarted","Data":"8eca9502c6cd7cd5211b1a6933088b25b1e89eb393089ff3a38ef15a117f4e29"}
Apr 20 20:30:03.660792 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:03.660742 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" podStartSLOduration=1.520296868 podStartE2EDuration="3.660729903s" podCreationTimestamp="2026-04-20 20:30:00 +0000 UTC" firstStartedPulling="2026-04-20 20:30:00.588036866 +0000 UTC m=+1133.266303906" lastFinishedPulling="2026-04-20 20:30:02.728469894 +0000 UTC m=+1135.406736941" observedRunningTime="2026-04-20 20:30:03.65913466 +0000 UTC m=+1136.337401716" watchObservedRunningTime="2026-04-20 20:30:03.660729903 +0000 UTC m=+1136.338997013"
Apr 20 20:30:04.061305 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:04.061269 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:30:09.856968 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:09.856930 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:30:17.256469 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:17.256428 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:30:23.712733 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:23.712698 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerID="8eca9502c6cd7cd5211b1a6933088b25b1e89eb393089ff3a38ef15a117f4e29" exitCode=6
Apr 20 20:30:23.713119 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:23.712745 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" event={"ID":"d1d1b1a1-feb7-437f-96a2-a465548c60d7","Type":"ContainerDied","Data":"8eca9502c6cd7cd5211b1a6933088b25b1e89eb393089ff3a38ef15a117f4e29"}
Apr 20 20:30:23.713119 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:23.713014 2578 scope.go:117] "RemoveContainer" containerID="8eca9502c6cd7cd5211b1a6933088b25b1e89eb393089ff3a38ef15a117f4e29"
Apr 20 20:30:24.717286 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:24.717208 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" event={"ID":"d1d1b1a1-feb7-437f-96a2-a465548c60d7","Type":"ContainerStarted","Data":"48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1"}
Apr 20 20:30:25.653553 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:25.653506 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:30:42.665962 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:42.665921 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:30:44.784984 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:44.784948 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerID="48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1" exitCode=6
Apr 20 20:30:44.785388 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:44.784996 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" event={"ID":"d1d1b1a1-feb7-437f-96a2-a465548c60d7","Type":"ContainerDied","Data":"48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1"}
Apr 20 20:30:44.785388 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:44.785030 2578 scope.go:117] "RemoveContainer" containerID="8eca9502c6cd7cd5211b1a6933088b25b1e89eb393089ff3a38ef15a117f4e29"
Apr 20 20:30:44.785388 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:44.785336 2578 scope.go:117] "RemoveContainer" containerID="48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1"
Apr 20 20:30:44.785618 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:30:44.785571 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611950-v754q_opendatahub(d1d1b1a1-feb7-437f-96a2-a465548c60d7)\"" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7"
Apr 20 20:30:55.553989 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:55.553952 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"]
Apr 20 20:30:56.886623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:56.886592 2578 scope.go:117] "RemoveContainer" containerID="48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1"
Apr 20 20:30:57.834213 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:57.834182 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" event={"ID":"d1d1b1a1-feb7-437f-96a2-a465548c60d7","Type":"ContainerStarted","Data":"f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a"}
Apr 20 20:30:57.910862 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:57.910829 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611950-v754q"]
Apr 20 20:30:58.837222 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:30:58.837184 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611950-v754q"
podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" containerID="cri-o://f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a" gracePeriod=30 Apr 20 20:31:07.837284 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:07.837254 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:31:07.840250 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:07.840233 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:31:17.688875 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.688854 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" Apr 20 20:31:17.704151 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.704126 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t9rt\" (UniqueName: \"kubernetes.io/projected/d1d1b1a1-feb7-437f-96a2-a465548c60d7-kube-api-access-5t9rt\") pod \"d1d1b1a1-feb7-437f-96a2-a465548c60d7\" (UID: \"d1d1b1a1-feb7-437f-96a2-a465548c60d7\") " Apr 20 20:31:17.706701 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.706667 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d1b1a1-feb7-437f-96a2-a465548c60d7-kube-api-access-5t9rt" (OuterVolumeSpecName: "kube-api-access-5t9rt") pod "d1d1b1a1-feb7-437f-96a2-a465548c60d7" (UID: "d1d1b1a1-feb7-437f-96a2-a465548c60d7"). InnerVolumeSpecName "kube-api-access-5t9rt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:31:17.805072 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.805006 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5t9rt\" (UniqueName: \"kubernetes.io/projected/d1d1b1a1-feb7-437f-96a2-a465548c60d7-kube-api-access-5t9rt\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:31:17.907814 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.907787 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerID="f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a" exitCode=6 Apr 20 20:31:17.907933 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.907857 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" Apr 20 20:31:17.907933 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.907869 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" event={"ID":"d1d1b1a1-feb7-437f-96a2-a465548c60d7","Type":"ContainerDied","Data":"f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a"} Apr 20 20:31:17.907933 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.907907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611950-v754q" event={"ID":"d1d1b1a1-feb7-437f-96a2-a465548c60d7","Type":"ContainerDied","Data":"0906e8955d9a5995216c8b14c81862e71503286163bbedb1a27ede329dd69064"} Apr 20 20:31:17.907933 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.907923 2578 scope.go:117] "RemoveContainer" containerID="f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a" Apr 20 20:31:17.915804 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.915791 2578 scope.go:117] "RemoveContainer" containerID="48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1" Apr 20 20:31:17.923005 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.922987 2578 scope.go:117] "RemoveContainer" containerID="f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a" Apr 20 20:31:17.923139 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.923083 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611950-v754q"] Apr 20 20:31:17.923244 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:31:17.923227 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a\": container with ID starting with f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a not found: ID does not exist" containerID="f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a" Apr 20 20:31:17.923286 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.923254 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a"} err="failed to get container status \"f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a\": rpc error: code = NotFound desc = could not find container \"f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a\": container with ID starting with f458c8b98ca62bdfe45d2d4c224f7e6065fbcad34a1a08e65435cf1a5eeab45a not found: ID does not exist" Apr 20 20:31:17.923286 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.923272 2578 scope.go:117] "RemoveContainer" containerID="48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1" Apr 20 20:31:17.923490 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:31:17.923475 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1\": container with ID starting with 48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1 not found: ID does not exist" containerID="48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1" Apr 20 20:31:17.923539 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.923494 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1"} err="failed to get container status \"48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1\": rpc error: code = NotFound desc = could not find container \"48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1\": container with ID starting with 48bfc7233a98cf711e84fe947b826efce5a590e42c51ca6fe3afb7e57fd31ab1 not found: ID does not exist" Apr 20 20:31:17.928550 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:17.928528 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611950-v754q"] Apr 20 20:31:19.891308 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:19.891272 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" path="/var/lib/kubelet/pods/d1d1b1a1-feb7-437f-96a2-a465548c60d7/volumes" Apr 20 20:31:41.761458 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:41.761425 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:31:50.956090 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:31:50.956049 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:32:00.059036 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:32:00.058955 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:32:08.052321 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:32:08.052289 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:32:17.066319 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:32:17.066283 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:32:26.657493 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:32:26.657456 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:32:34.352645 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:32:34.352611 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:32:39.649887 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:32:39.649842 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:32:43.453745 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:32:43.453701 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:32:51.261650 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:32:51.261615 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:33:00.355399 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:33:00.355370 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:33:09.056911 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:33:09.056875 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:33:18.353215 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:33:18.353173 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:33:27.160733 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:33:27.160644 
2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:33:34.954381 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:33:34.954350 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:33:43.957153 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:33:43.957118 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:33:52.555228 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:33:52.555188 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:34:01.455671 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:34:01.455635 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:34:09.665268 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:34:09.665230 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:36:07.863121 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:36:07.863091 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:36:07.869069 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:36:07.869049 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:36:28.156743 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:36:28.156662 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:36:32.857393 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:36:32.857360 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:36:58.566766 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:36:58.566729 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:37:05.063281 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:37:05.063241 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:37:13.967427 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:37:13.967394 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:37:24.556393 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:37:24.556357 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:37:34.192262 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:37:34.192221 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:37:44.860867 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:37:44.860829 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:37:52.853545 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:37:52.853498 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:38:03.076664 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:38:03.076569 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] 
Apr 20 20:38:12.852111 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:38:12.852079 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:38:23.457845 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:38:23.457809 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:38:32.660021 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:38:32.659988 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:39:07.256952 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:39:07.256914 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:39:50.363973 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:39:50.363884 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:39:58.457994 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:39:58.457959 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:40:06.456461 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:40:06.456429 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:40:15.456404 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:40:15.456370 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:40:23.668182 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:40:23.668139 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:40:34.564822 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:40:34.564789 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:40:43.051998 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:40:43.051965 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:40:51.961230 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:40:51.961194 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:41:00.607445 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:41:00.607370 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:41:07.897148 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:41:07.897116 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:41:07.903862 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:41:07.903840 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:41:08.259620 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:41:08.259565 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:41:16.562623 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:41:16.562590 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:41:27.951993 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:41:27.951963 2578 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:41:45.276989 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:41:45.276950 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:41:53.560436 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:41:53.560398 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:42:02.563253 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:42:02.563216 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:42:10.754634 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:42:10.754571 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:42:27.457099 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:42:27.457015 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:42:35.257136 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:42:35.257094 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:42:44.853999 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:42:44.853963 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:42:52.870427 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:42:52.870397 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:43:02.158735 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:43:02.158706 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:43:10.553635 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:43:10.553600 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:43:19.459252 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:43:19.459220 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:43:32.554108 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:43:32.554065 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:43:41.357545 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:43:41.357507 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:43:52.556417 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:43:52.556381 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:44:01.455927 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:44:01.455848 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:44:08.353720 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:44:08.353684 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:44:19.056133 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:44:19.056098 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:44:25.555940 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:44:25.555905 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:44:43.252548 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:44:43.252517 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:44:51.458117 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:44:51.458081 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:44:59.460925 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:44:59.460889 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:45:00.134343 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.134304 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611965-rr5l4"] Apr 20 20:45:00.134734 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.134717 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:00.134831 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.134736 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:00.134831 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.134762 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:00.134831 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.134770 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:00.134987 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.134851 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:00.134987 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.134862 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:00.134987 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.134877 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:00.138037 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.138017 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" Apr 20 20:45:00.140471 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.140447 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qp9nz\"" Apr 20 20:45:00.155434 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.155409 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611965-rr5l4"] Apr 20 20:45:00.274540 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.274513 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gdj9\" (UniqueName: \"kubernetes.io/projected/91ea93fa-fb3c-497b-9a9e-93be311bf366-kube-api-access-2gdj9\") pod \"maas-api-key-cleanup-29611965-rr5l4\" (UID: \"91ea93fa-fb3c-497b-9a9e-93be311bf366\") " pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" Apr 20 20:45:00.375294 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.375259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gdj9\" (UniqueName: \"kubernetes.io/projected/91ea93fa-fb3c-497b-9a9e-93be311bf366-kube-api-access-2gdj9\") pod \"maas-api-key-cleanup-29611965-rr5l4\" (UID: \"91ea93fa-fb3c-497b-9a9e-93be311bf366\") " pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" Apr 20 20:45:00.383972 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.383943 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gdj9\" (UniqueName: \"kubernetes.io/projected/91ea93fa-fb3c-497b-9a9e-93be311bf366-kube-api-access-2gdj9\") pod \"maas-api-key-cleanup-29611965-rr5l4\" (UID: \"91ea93fa-fb3c-497b-9a9e-93be311bf366\") " pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" Apr 20 20:45:00.447667 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.447638 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" Apr 20 20:45:00.568533 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.568509 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611965-rr5l4"] Apr 20 20:45:00.570627 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:45:00.570602 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91ea93fa_fb3c_497b_9a9e_93be311bf366.slice/crio-d9586c0e220b737bb9b820652365acf159ee81829e05bf3e061456cea9a60c78 WatchSource:0}: Error finding container d9586c0e220b737bb9b820652365acf159ee81829e05bf3e061456cea9a60c78: Status 404 returned error can't find the container with id d9586c0e220b737bb9b820652365acf159ee81829e05bf3e061456cea9a60c78 Apr 20 20:45:00.578047 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.577012 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:45:00.603602 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:00.603550 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" event={"ID":"91ea93fa-fb3c-497b-9a9e-93be311bf366","Type":"ContainerStarted","Data":"d9586c0e220b737bb9b820652365acf159ee81829e05bf3e061456cea9a60c78"} Apr 20 20:45:01.608071 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:01.608038 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" event={"ID":"91ea93fa-fb3c-497b-9a9e-93be311bf366","Type":"ContainerStarted","Data":"2564a6db731a9dead690bb2537cf384400cc584c139b8d3067dcd1ba9aa0de86"} Apr 20 20:45:01.623352 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:01.623311 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" podStartSLOduration=1.623297966 podStartE2EDuration="1.623297966s" podCreationTimestamp="2026-04-20 20:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:45:01.622408984 +0000 UTC m=+2034.300676041" watchObservedRunningTime="2026-04-20 20:45:01.623297966 +0000 UTC m=+2034.301565024" Apr 20 20:45:06.661423 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:06.661385 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:45:21.673635 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:21.673572 2578 generic.go:358] "Generic (PLEG): container finished" podID="91ea93fa-fb3c-497b-9a9e-93be311bf366" containerID="2564a6db731a9dead690bb2537cf384400cc584c139b8d3067dcd1ba9aa0de86" exitCode=6 Apr 20 20:45:21.674042 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:21.673656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" event={"ID":"91ea93fa-fb3c-497b-9a9e-93be311bf366","Type":"ContainerDied","Data":"2564a6db731a9dead690bb2537cf384400cc584c139b8d3067dcd1ba9aa0de86"} Apr 20 20:45:21.674042 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:21.673992 2578 scope.go:117] "RemoveContainer" containerID="2564a6db731a9dead690bb2537cf384400cc584c139b8d3067dcd1ba9aa0de86" Apr 20 20:45:22.678401 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:22.678365 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" 
event={"ID":"91ea93fa-fb3c-497b-9a9e-93be311bf366","Type":"ContainerStarted","Data":"58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31"} Apr 20 20:45:31.957193 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:31.957111 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:45:42.751415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:42.751382 2578 generic.go:358] "Generic (PLEG): container finished" podID="91ea93fa-fb3c-497b-9a9e-93be311bf366" containerID="58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31" exitCode=6 Apr 20 20:45:42.751842 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:42.751459 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" event={"ID":"91ea93fa-fb3c-497b-9a9e-93be311bf366","Type":"ContainerDied","Data":"58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31"} Apr 20 20:45:42.751842 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:42.751504 2578 scope.go:117] "RemoveContainer" containerID="2564a6db731a9dead690bb2537cf384400cc584c139b8d3067dcd1ba9aa0de86" Apr 20 20:45:42.751965 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:42.751875 2578 scope.go:117] "RemoveContainer" containerID="58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31" Apr 20 20:45:42.752123 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:45:42.752095 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611965-rr5l4_opendatahub(91ea93fa-fb3c-497b-9a9e-93be311bf366)\"" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" podUID="91ea93fa-fb3c-497b-9a9e-93be311bf366" Apr 20 20:45:44.261504 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:44.261471 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-p5xxh"] Apr 20 20:45:49.788801 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:49.788774 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29611965-rr5l4_91ea93fa-fb3c-497b-9a9e-93be311bf366/cleanup/1.log" Apr 20 20:45:50.128076 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:50.127997 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7f7bf89c4-747pn_6be6a96a-5585-4d1d-b02b-8ab01176b90b/manager/0.log" Apr 20 20:45:51.814385 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:51.814353 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-8rj49_67c536d1-256d-465b-8dbb-23442620b388/manager/0.log" Apr 20 20:45:52.128455 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:52.128386 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-64ngl_599f74c2-9421-424b-9b12-072566336497/registry-server/0.log" Apr 20 20:45:52.341841 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:52.341811 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-p5xxh_5e661ad4-502f-4dd7-adec-3dea8dd4f72e/limitador/0.log" Apr 20 20:45:52.456789 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:52.456748 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-zzg4b_b2734f44-c040-42b8-bd74-c8526ca495cc/manager/0.log" Apr 20 
20:45:52.775761 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:52.775687 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fd28nh_f8bc649e-1d44-46a6-ac5a-562503a3a0e6/istio-proxy/0.log" Apr 20 20:45:53.886470 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:53.886434 2578 scope.go:117] "RemoveContainer" containerID="58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31" Apr 20 20:45:54.793104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:54.793066 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" event={"ID":"91ea93fa-fb3c-497b-9a9e-93be311bf366","Type":"ContainerStarted","Data":"fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181"} Apr 20 20:45:54.910706 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:54.910671 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611965-rr5l4"] Apr 20 20:45:55.797055 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:55.797016 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" podUID="91ea93fa-fb3c-497b-9a9e-93be311bf366" containerName="cleanup" containerID="cri-o://fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181" gracePeriod=30 Apr 20 20:45:57.749788 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.749750 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wj6w5/must-gather-277ps"] Apr 20 20:45:57.750195 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.750120 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:57.750195 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.750132 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d1b1a1-feb7-437f-96a2-a465548c60d7" containerName="cleanup" Apr 20 20:45:57.753186 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.753167 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wj6w5/must-gather-277ps" Apr 20 20:45:57.755973 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.755947 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wj6w5\"/\"kube-root-ca.crt\"" Apr 20 20:45:57.756111 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.755986 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wj6w5\"/\"openshift-service-ca.crt\"" Apr 20 20:45:57.756111 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.755996 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wj6w5\"/\"default-dockercfg-n67tm\"" Apr 20 20:45:57.760744 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.760724 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wj6w5/must-gather-277ps"] Apr 20 20:45:57.848216 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.848181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b31a01a-204b-472e-8638-1af12c8e9dfb-must-gather-output\") pod \"must-gather-277ps\" (UID: \"8b31a01a-204b-472e-8638-1af12c8e9dfb\") " pod="openshift-must-gather-wj6w5/must-gather-277ps" Apr 20 20:45:57.848216 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.848217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdqbj\" (UniqueName: \"kubernetes.io/projected/8b31a01a-204b-472e-8638-1af12c8e9dfb-kube-api-access-hdqbj\") pod \"must-gather-277ps\" (UID: \"8b31a01a-204b-472e-8638-1af12c8e9dfb\") " pod="openshift-must-gather-wj6w5/must-gather-277ps" Apr 20 20:45:57.948689 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.948659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b31a01a-204b-472e-8638-1af12c8e9dfb-must-gather-output\") pod \"must-gather-277ps\" (UID: \"8b31a01a-204b-472e-8638-1af12c8e9dfb\") " pod="openshift-must-gather-wj6w5/must-gather-277ps" Apr 20 20:45:57.948689 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.948691 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdqbj\" (UniqueName: \"kubernetes.io/projected/8b31a01a-204b-472e-8638-1af12c8e9dfb-kube-api-access-hdqbj\") pod \"must-gather-277ps\" (UID: \"8b31a01a-204b-472e-8638-1af12c8e9dfb\") " pod="openshift-must-gather-wj6w5/must-gather-277ps" Apr 20 20:45:57.949037 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.949018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b31a01a-204b-472e-8638-1af12c8e9dfb-must-gather-output\") pod \"must-gather-277ps\" (UID: \"8b31a01a-204b-472e-8638-1af12c8e9dfb\") " pod="openshift-must-gather-wj6w5/must-gather-277ps" Apr 20 20:45:57.956931 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:57.956906 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdqbj\" (UniqueName: \"kubernetes.io/projected/8b31a01a-204b-472e-8638-1af12c8e9dfb-kube-api-access-hdqbj\") pod \"must-gather-277ps\" (UID: \"8b31a01a-204b-472e-8638-1af12c8e9dfb\") " pod="openshift-must-gather-wj6w5/must-gather-277ps" Apr 20 20:45:58.063305 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:58.063237 2578 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-wj6w5/must-gather-277ps" Apr 20 20:45:58.191834 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:58.191770 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wj6w5/must-gather-277ps"] Apr 20 20:45:58.194348 ip-10-0-134-66 kubenswrapper[2578]: W0420 20:45:58.194319 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b31a01a_204b_472e_8638_1af12c8e9dfb.slice/crio-507c7427f460e8ec4bdd914c616a33c28801731d54650f0d50fa3d00b3f65847 WatchSource:0}: Error finding container 507c7427f460e8ec4bdd914c616a33c28801731d54650f0d50fa3d00b3f65847: Status 404 returned error can't find the container with id 507c7427f460e8ec4bdd914c616a33c28801731d54650f0d50fa3d00b3f65847 Apr 20 20:45:58.809052 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:58.809018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/must-gather-277ps" event={"ID":"8b31a01a-204b-472e-8638-1af12c8e9dfb","Type":"ContainerStarted","Data":"507c7427f460e8ec4bdd914c616a33c28801731d54650f0d50fa3d00b3f65847"} Apr 20 20:45:59.817055 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:59.817021 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/must-gather-277ps" event={"ID":"8b31a01a-204b-472e-8638-1af12c8e9dfb","Type":"ContainerStarted","Data":"2f0fd305536bac5f5da0ec7ffd80f17211fc695b00299a33e8c440eba94ffb09"} Apr 20 20:45:59.817055 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:59.817060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/must-gather-277ps" event={"ID":"8b31a01a-204b-472e-8638-1af12c8e9dfb","Type":"ContainerStarted","Data":"3948eb68ac324a6b8e44087bc5c8deb78b6565666e59917518866096cd28c3f7"} Apr 20 20:45:59.832637 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:45:59.832566 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wj6w5/must-gather-277ps" podStartSLOduration=2.029657369 podStartE2EDuration="2.8325509s" podCreationTimestamp="2026-04-20 20:45:57 +0000 UTC" firstStartedPulling="2026-04-20 20:45:58.196508755 +0000 UTC m=+2090.874775790" lastFinishedPulling="2026-04-20 20:45:58.999402283 +0000 UTC m=+2091.677669321" observedRunningTime="2026-04-20 20:45:59.831529998 +0000 UTC m=+2092.509797081" watchObservedRunningTime="2026-04-20 20:45:59.8325509 +0000 UTC m=+2092.510817959" Apr 20 20:46:00.547929 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:00.547860 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2cxr4_06fca970-a9c2-455e-87f0-219af27b7cfd/global-pull-secret-syncer/0.log" Apr 20 20:46:00.676261 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:00.676226 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l9zp2_47af8f64-fc60-48b5-84b3-4c3888a04afa/konnectivity-agent/0.log" Apr 20 20:46:00.784110 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:00.784075 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-66.ec2.internal_19015d708a7f5256313024ebc4553800/haproxy/0.log" Apr 20 20:46:05.205228 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:05.205147 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-8rj49_67c536d1-256d-465b-8dbb-23442620b388/manager/0.log" Apr 20 20:46:05.286280 ip-10-0-134-66 
kubenswrapper[2578]: I0420 20:46:05.286250 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-64ngl_599f74c2-9421-424b-9b12-072566336497/registry-server/0.log" Apr 20 20:46:05.368351 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:05.368321 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-p5xxh_5e661ad4-502f-4dd7-adec-3dea8dd4f72e/limitador/0.log" Apr 20 20:46:05.450816 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:05.450674 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-zzg4b_b2734f44-c040-42b8-bd74-c8526ca495cc/manager/0.log" Apr 20 20:46:07.217609 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.217557 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-2kk8s_b4de5fc9-8d30-4aad-9ad4-6c3fe0910598/monitoring-plugin/0.log" Apr 20 20:46:07.337075 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.337013 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lzw7p_71b72aed-15d6-4289-8b96-b55802ca0734/node-exporter/0.log" Apr 20 20:46:07.403357 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.403282 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lzw7p_71b72aed-15d6-4289-8b96-b55802ca0734/kube-rbac-proxy/0.log" Apr 20 20:46:07.441219 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.441146 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lzw7p_71b72aed-15d6-4289-8b96-b55802ca0734/init-textfile/0.log" Apr 20 20:46:07.683891 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.683804 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26256eac-6ede-44cd-8112-507446876176/prometheus/0.log" Apr 20 20:46:07.729626 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.729570 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26256eac-6ede-44cd-8112-507446876176/config-reloader/0.log" Apr 20 20:46:07.778928 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.778899 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26256eac-6ede-44cd-8112-507446876176/thanos-sidecar/0.log" Apr 20 20:46:07.817851 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.817816 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26256eac-6ede-44cd-8112-507446876176/kube-rbac-proxy-web/0.log" Apr 20 20:46:07.870289 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.870261 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26256eac-6ede-44cd-8112-507446876176/kube-rbac-proxy/0.log" Apr 20 20:46:07.898631 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.898597 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26256eac-6ede-44cd-8112-507446876176/kube-rbac-proxy-thanos/0.log" Apr 20 20:46:07.920191 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.920167 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_26256eac-6ede-44cd-8112-507446876176/init-config-reloader/0.log" Apr 20 20:46:07.929339 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.929315 2578 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:46:07.935263 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:07.935198 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log" Apr 20 20:46:08.137970 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:08.137923 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c9795dfbf-smds4_67b94327-0f61-4d6d-ac15-45fa3a174b44/thanos-query/0.log" Apr 20 20:46:08.168400 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:08.168372 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c9795dfbf-smds4_67b94327-0f61-4d6d-ac15-45fa3a174b44/kube-rbac-proxy-web/0.log" Apr 20 20:46:08.199738 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:08.199658 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c9795dfbf-smds4_67b94327-0f61-4d6d-ac15-45fa3a174b44/kube-rbac-proxy/0.log" Apr 20 20:46:08.234346 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:08.234318 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c9795dfbf-smds4_67b94327-0f61-4d6d-ac15-45fa3a174b44/prom-label-proxy/0.log" Apr 20 20:46:08.256928 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:08.256898 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c9795dfbf-smds4_67b94327-0f61-4d6d-ac15-45fa3a174b44/kube-rbac-proxy-rules/0.log" Apr 20 20:46:08.280610 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:08.280571 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c9795dfbf-smds4_67b94327-0f61-4d6d-ac15-45fa3a174b44/kube-rbac-proxy-metrics/0.log" Apr 20 20:46:09.301768 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.301732 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62"] Apr 20 20:46:09.307298 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.307266 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.312527 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.312497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62"] Apr 20 20:46:09.362911 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.362870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nhk6\" (UniqueName: \"kubernetes.io/projected/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-kube-api-access-4nhk6\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.363104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.362941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-lib-modules\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.363104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.362989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-sys\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.363104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.363011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-proc\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.363104 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.363086 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-podres\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463499 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nhk6\" (UniqueName: \"kubernetes.io/projected/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-kube-api-access-4nhk6\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463499 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-lib-modules\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463770 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463540 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-sys\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463770 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-proc\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463770 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-podres\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463770 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-sys\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463770 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-proc\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463770 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463732 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-podres\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.463770 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.463735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-lib-modules\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.477269 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.476479 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nhk6\" (UniqueName: \"kubernetes.io/projected/5a605a5d-dadf-4f60-9c8e-20ceceed5a2a-kube-api-access-4nhk6\") pod \"perf-node-gather-daemonset-djf62\" (UID: \"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a\") " pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:09.620552 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.620463 2578 util.go:30] "No sandbox for pod can be found. 
Apr 20 20:46:09.620552 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.620463 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62"
Apr 20 20:46:09.798706 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.798672 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62"]
Apr 20 20:46:09.861038 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:09.861007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" event={"ID":"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a","Type":"ContainerStarted","Data":"1bd674201610231e1f0d40d20e79edef5cc24fe0ec1f40ce1f7ab66e1202a3f8"}
Apr 20 20:46:10.867190 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:10.867142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" event={"ID":"5a605a5d-dadf-4f60-9c8e-20ceceed5a2a","Type":"ContainerStarted","Data":"cad898f4edc63dbdaa900a32f4eccf520e66001376c0e34076d6bec7c74cb4d6"}
Apr 20 20:46:10.867880 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:10.867851 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62"
Apr 20 20:46:10.884722 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:10.884678 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" podStartSLOduration=1.884663748 podStartE2EDuration="1.884663748s" podCreationTimestamp="2026-04-20 20:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:46:10.882054084 +0000 UTC m=+2103.560321144" watchObservedRunningTime="2026-04-20 20:46:10.884663748 +0000 UTC m=+2103.562930804"
Apr 20 20:46:11.630314 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:11.630240 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8zpg6_14b4f842-f248-42a7-8dcd-51dcb4e2ab36/dns/0.log"
Apr 20 20:46:11.648530 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:11.648506 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8zpg6_14b4f842-f248-42a7-8dcd-51dcb4e2ab36/kube-rbac-proxy/0.log"
Apr 20 20:46:11.737571 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:11.737541 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nd6kq_3246feba-bd0f-4518-845a-c9025f63125b/dns-node-resolver/0.log"
Apr 20 20:46:12.270843 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:12.270820 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xq679_b5e335ef-aa11-4058-9448-d253b1a467c3/node-ca/0.log"
Apr 20 20:46:13.028355 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:13.028327 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fd28nh_f8bc649e-1d44-46a6-ac5a-562503a3a0e6/istio-proxy/0.log"
Apr 20 20:46:13.755590 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:13.755539 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jp7bd_3e10e1c4-64b7-4da9-be2e-38882fc2fbb7/serve-healthcheck-canary/0.log"
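
The pod_startup_latency_tracker entry above reports podStartSLOduration=1.884663748 for a pod created at 20:46:09 and observed running at 20:46:10.884663748; no image pull happened (both pulling timestamps are the zero time), so the SLO duration equals watchObservedRunningTime minus podCreationTimestamp. A small self-contained Go sketch redoing that arithmetic (an illustration, not the tracker's source; the monotonic-clock suffix "m=+..." has to be stripped, since time.Parse cannot read it):

    package main

    import (
        "fmt"
        "strings"
        "time"
    )

    func main() {
        // Timestamps copied from the "Observed pod startup duration" entry above.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created := "2026-04-20 20:46:09 +0000 UTC"
        observed := "2026-04-20 20:46:10.884663748 +0000 UTC m=+2103.562930804"
        // Drop the Go monotonic reading before parsing.
        if i := strings.Index(observed, " m="); i >= 0 {
            observed = observed[:i]
        }
        t0, err := time.Parse(layout, created)
        if err != nil {
            panic(err)
        }
        t1, err := time.Parse(layout, observed)
        if err != nil {
            panic(err)
        }
        fmt.Println(t1.Sub(t0)) // 1.884663748s, matching podStartSLOduration
    }
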
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8hmpw_210a9def-653b-4320-b877-97fe51dc9677/insights-operator/0.log" Apr 20 20:46:14.200443 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.200422 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8hmpw_210a9def-653b-4320-b877-97fe51dc9677/insights-operator/1.log" Apr 20 20:46:14.274877 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.274850 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpr4q_84d2f547-e0e6-4189-b17f-bfe907634d26/kube-rbac-proxy/0.log" Apr 20 20:46:14.292493 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.292465 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpr4q_84d2f547-e0e6-4189-b17f-bfe907634d26/exporter/0.log" Apr 20 20:46:14.313295 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.313269 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpr4q_84d2f547-e0e6-4189-b17f-bfe907634d26/extractor/0.log" Apr 20 20:46:14.768018 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.767954 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" Apr 20 20:46:14.815518 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.814963 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gdj9\" (UniqueName: \"kubernetes.io/projected/91ea93fa-fb3c-497b-9a9e-93be311bf366-kube-api-access-2gdj9\") pod \"91ea93fa-fb3c-497b-9a9e-93be311bf366\" (UID: \"91ea93fa-fb3c-497b-9a9e-93be311bf366\") " Apr 20 20:46:14.818259 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.818229 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91ea93fa-fb3c-497b-9a9e-93be311bf366-kube-api-access-2gdj9" (OuterVolumeSpecName: "kube-api-access-2gdj9") pod "91ea93fa-fb3c-497b-9a9e-93be311bf366" (UID: "91ea93fa-fb3c-497b-9a9e-93be311bf366"). InnerVolumeSpecName "kube-api-access-2gdj9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:46:14.885005 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.884972 2578 generic.go:358] "Generic (PLEG): container finished" podID="91ea93fa-fb3c-497b-9a9e-93be311bf366" containerID="fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181" exitCode=6 Apr 20 20:46:14.885156 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.885053 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" Apr 20 20:46:14.885156 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.885058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" event={"ID":"91ea93fa-fb3c-497b-9a9e-93be311bf366","Type":"ContainerDied","Data":"fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181"} Apr 20 20:46:14.885156 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.885097 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611965-rr5l4" event={"ID":"91ea93fa-fb3c-497b-9a9e-93be311bf366","Type":"ContainerDied","Data":"d9586c0e220b737bb9b820652365acf159ee81829e05bf3e061456cea9a60c78"} Apr 20 20:46:14.885156 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.885115 2578 scope.go:117] "RemoveContainer" containerID="fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181" Apr 20 20:46:14.895878 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.895805 2578 scope.go:117] "RemoveContainer" containerID="58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31" Apr 20 20:46:14.903472 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.903456 2578 scope.go:117] "RemoveContainer" containerID="fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181" Apr 20 20:46:14.903757 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:46:14.903734 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181\": container with ID starting with fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181 not found: ID does not exist" containerID="fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181" Apr 20 20:46:14.903830 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.903771 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181"} err="failed to get container status \"fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181\": rpc error: code = NotFound desc = could not find container \"fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181\": container with ID starting with fe4b3d060696fba90d3628fa9da94cf0328a62a662e18f65cf6d7617f2921181 not found: ID does not exist" Apr 20 20:46:14.903830 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.903796 2578 scope.go:117] "RemoveContainer" containerID="58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31" Apr 20 20:46:14.904061 ip-10-0-134-66 kubenswrapper[2578]: E0420 20:46:14.904043 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31\": container with ID starting with 58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31 not found: ID does not exist" containerID="58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31" Apr 20 20:46:14.904125 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.904068 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31"} err="failed to get container status \"58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31\": rpc error: code = NotFound desc = could not find container 
\"58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31\": container with ID starting with 58bdbca88d900e46ee3888c58d6cbd9c94dd27adfc840bf55092e02ae03fdd31 not found: ID does not exist" Apr 20 20:46:14.914599 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.913848 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611965-rr5l4"] Apr 20 20:46:14.914599 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.913898 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611965-rr5l4"] Apr 20 20:46:14.916631 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:14.916607 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2gdj9\" (UniqueName: \"kubernetes.io/projected/91ea93fa-fb3c-497b-9a9e-93be311bf366-kube-api-access-2gdj9\") on node \"ip-10-0-134-66.ec2.internal\" DevicePath \"\"" Apr 20 20:46:15.891432 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:15.891387 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91ea93fa-fb3c-497b-9a9e-93be311bf366" path="/var/lib/kubelet/pods/91ea93fa-fb3c-497b-9a9e-93be311bf366/volumes" Apr 20 20:46:16.473171 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:16.473144 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7f7bf89c4-747pn_6be6a96a-5585-4d1d-b02b-8ab01176b90b/manager/0.log" Apr 20 20:46:17.605416 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:17.605390 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-54d459c768-k9r6t_edabbbcf-9ce5-4f2a-a60a-5f45697c0006/manager/0.log" Apr 20 20:46:17.651483 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:17.651459 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-vx4zj_0643c57f-304b-4062-90f3-7099c45b7e07/openshift-lws-operator/0.log" Apr 20 20:46:17.885341 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:17.885279 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wj6w5/perf-node-gather-daemonset-djf62" Apr 20 20:46:22.061536 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:22.061513 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-flvcg_27f6d5e5-7798-499d-a755-3e7bfa905be0/migrator/0.log" Apr 20 20:46:22.087822 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:22.087793 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-flvcg_27f6d5e5-7798-499d-a755-3e7bfa905be0/graceful-termination/0.log" Apr 20 20:46:23.333301 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.333273 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9fj49_188b7548-da8a-4c62-b93e-536b9b6e09ea/kube-multus/0.log" Apr 20 20:46:23.637599 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.637499 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxkm_2ff55742-6058-4a56-941a-aec639653897/kube-multus-additional-cni-plugins/0.log" Apr 20 20:46:23.655861 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.655834 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxkm_2ff55742-6058-4a56-941a-aec639653897/egress-router-binary-copy/0.log" Apr 20 20:46:23.675913 ip-10-0-134-66 
Apr 20 20:46:23.675913 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.675889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxkm_2ff55742-6058-4a56-941a-aec639653897/cni-plugins/0.log"
Apr 20 20:46:23.695256 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.695223 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxkm_2ff55742-6058-4a56-941a-aec639653897/bond-cni-plugin/0.log"
Apr 20 20:46:23.719191 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.719168 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxkm_2ff55742-6058-4a56-941a-aec639653897/routeoverride-cni/0.log"
Apr 20 20:46:23.740254 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.740234 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxkm_2ff55742-6058-4a56-941a-aec639653897/whereabouts-cni-bincopy/0.log"
Apr 20 20:46:23.759457 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.759427 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdxkm_2ff55742-6058-4a56-941a-aec639653897/whereabouts-cni/0.log"
Apr 20 20:46:23.930061 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.930034 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zc8kt_a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d/network-metrics-daemon/0.log"
Apr 20 20:46:23.947080 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:23.947060 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zc8kt_a5778b2a-3cb2-40b9-a6ac-ba4bc47b3b7d/kube-rbac-proxy/0.log"
Apr 20 20:46:25.330415 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.330388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-controller/0.log"
Apr 20 20:46:25.345198 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.345175 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/0.log"
Apr 20 20:46:25.355364 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.355336 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovn-acl-logging/1.log"
Apr 20 20:46:25.374037 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.374019 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/kube-rbac-proxy-node/0.log"
Apr 20 20:46:25.392918 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.392893 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 20:46:25.409064 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.409040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/northd/0.log"
Apr 20 20:46:25.426282 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.426258 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/nbdb/0.log"
Apr 20 20:46:25.443616 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.443545 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/sbdb/0.log"
Apr 20 20:46:25.564252 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:25.564222 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zg9wv_4d7257b5-0312-4d95-b84e-bf7589326417/ovnkube-controller/0.log"
Apr 20 20:46:26.607741 ip-10-0-134-66 kubenswrapper[2578]: I0420 20:46:26.607713 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-g75rf_c5e48953-ddd6-4f02-b2d7-7a42dede535f/network-check-target-container/0.log"
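
The recurring "Finished parsing log file" entries throughout this section come from the kubelet reading container log files under /var/log/pods, here plausibly driven by the must-gather collection running on the node. An illustrative Go sketch under the assumption that those files use the CRI logging format, one entry per line: an RFC3339Nano timestamp, the stream (stdout/stderr), an F tag for a full line or P for a partial one, then the message. The message bodies below are invented placeholders:

    package main

    import (
        "bufio"
        "fmt"
        "strings"
    )

    func main() {
        // Two invented lines in the assumed CRI log format.
        sample := `2026-04-20T20:46:07.935198000Z stdout F acl_log: verdict=allow
2026-04-20T20:46:08.137923000Z stderr P partial line without a trailing newline`
        sc := bufio.NewScanner(strings.NewReader(sample))
        for sc.Scan() {
            // SplitN keeps spaces inside the message itself intact.
            parts := strings.SplitN(sc.Text(), " ", 4)
            if len(parts) != 4 {
                continue // not a CRI-format line
            }
            ts, stream, tag, msg := parts[0], parts[1], parts[2], parts[3]
            fmt.Printf("%s [%s/%s] %s\n", ts, stream, tag, msg)
        }
    }
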