Apr 16 18:28:17.562592 ip-10-0-135-146 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:28:17.562603 ip-10-0-135-146 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:28:17.562612 ip-10-0-135-146 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:28:17.562906 ip-10-0-135-146 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:28:27.591185 ip-10-0-135-146 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:28:27.591206 ip-10-0-135-146 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f7801e9898664f2399d344d6c849fe81 --
Apr 16 18:30:48.787450 ip-10-0-135-146 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:30:49.261136 ip-10-0-135-146 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:49.261136 ip-10-0-135-146 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:30:49.261136 ip-10-0-135-146 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:49.261136 ip-10-0-135-146 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:30:49.261136 ip-10-0-135-146 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:49.263064 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.262974 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:30:49.265269 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265253 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:49.265269 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265268 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265272 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265276 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265279 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265282 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265285 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265288 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265291 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265294 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265296 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265299 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265301 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265305 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265307 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265310 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265313 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265315 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265318 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265321 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:49.265332 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265323 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265326 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265328 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265338 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265343 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265347 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265350 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265353 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265356 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265359 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265362 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265371 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265374 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265377 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265380 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265382 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265384 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265387 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265390 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:49.265790 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265393 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265395 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265398 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265400 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265402 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265405 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265408 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265410 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265413 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265416 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265419 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265422 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265424 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265427 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265429 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265433 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265436 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265439 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265441 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265444 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:49.266298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265448 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265451 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265453 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265456 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265458 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265461 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265463 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265466 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265468 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265470 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265473 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265476 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265478 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265481 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265483 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265486 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265489 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265492 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265495 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265497 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:49.266868 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265500 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265502 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265505 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265507 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265509 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265512 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265515 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265898 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265904 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265907 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265910 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265913 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265915 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265918 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265921 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265924 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265927 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265930 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265934 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:49.267421 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265937 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265940 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265943 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265945 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265948 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265951 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265954 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265956 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265959 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265971 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265974 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265977 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265980 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265983 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265986 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265989 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265991 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265994 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265997 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.265999 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:49.267889 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266002 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266005 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266008 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266011 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266013 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266016 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266018 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266021 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266024 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266027 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266030 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266033 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266035 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266038 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266040 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266043 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266045 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266048 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266050 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266053 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:49.268402 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266056 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266058 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266061 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266064 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266066 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266069 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266071 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266074 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266076 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266079 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266082 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266085 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266088 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266092 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266094 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266098 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266101 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266103 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266106 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266109 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:49.268892 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266111 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266114 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266116 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266119 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266121 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266124 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266127 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266129 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266132 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266134 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266136 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266139 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266158 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266162 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266241 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266249 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266255 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266260 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266266 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266269 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266274 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:30:49.269404 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266279 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266283 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266286 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266289 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266292 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266296 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266299 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266302 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266305 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266308 2580 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266311 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266314 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266320 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266323 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266326 2580 flags.go:64] FLAG: --config-dir=""
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266329 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266332 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266336 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266339 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266342 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266346 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266349 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266352 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266355 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266358 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:30:49.269942 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266361 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266365 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266369 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266375 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266378 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266381 2580 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266384 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266389 2580 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266392 2580 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266395 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266398 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266401 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266405 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266408 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266411 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266415 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266418 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266421 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266424 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266427 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266430 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266433 2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266436 2580 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266440 2580 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266443 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:30:49.270582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266446 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266449 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266452 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266455 2580 flags.go:64] FLAG: --help="false" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266458 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266461 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266464 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266467 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266471 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266475 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266481 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266484 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:30:49.271210 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:30:49.266487 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266489 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266493 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266496 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266499 2580 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266502 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266504 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266507 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266510 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266513 2580 flags.go:64] FLAG: --lock-file="" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266516 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266519 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:30:49.271210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266522 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266527 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266530 2580 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266533 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266536 2580 flags.go:64] FLAG: --logging-format="text" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266539 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266542 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266545 2580 flags.go:64] FLAG: --manifest-url="" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266548 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266552 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266556 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266560 2580 flags.go:64] FLAG: --max-pods="110" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266563 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266566 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266568 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266571 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266575 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:30:49.271797 
ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266578 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266583 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266590 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266594 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266597 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266600 2580 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:30:49.271797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266603 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266609 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266611 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266615 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266618 2580 flags.go:64] FLAG: --port="10250" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266621 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266624 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bbb3ad8a2f3fbff8" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266627 2580 flags.go:64] FLAG: --qos-reserved="" Apr 
16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266630 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266633 2580 flags.go:64] FLAG: --register-node="true" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266636 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266639 2580 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266643 2580 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266646 2580 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266648 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266651 2580 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266655 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266658 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266662 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266665 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266667 2580 flags.go:64] FLAG: --runonce="false" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266670 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266673 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266676 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266679 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266683 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:30:49.272397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266686 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266689 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266693 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266697 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266700 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266703 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266706 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266709 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266712 2580 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266715 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266720 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:30:49.273052 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:30:49.266723 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266725 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266730 2580 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266733 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266736 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266738 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266741 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266745 2580 flags.go:64] FLAG: --v="2" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266749 2580 flags.go:64] FLAG: --version="false" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266753 2580 flags.go:64] FLAG: --vmodule="" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266757 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.266761 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266850 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:30:49.273052 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266856 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266859 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266863 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266866 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266869 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266872 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266875 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266878 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266881 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266883 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266888 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266890 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266893 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266896 2580 feature_gate.go:328] 
unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266899 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266902 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266905 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266908 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266911 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266913 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:30:49.273661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266916 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266918 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266921 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266925 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266927 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266930 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 
18:30:49.266933 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266935 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266938 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266940 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266943 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266947 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266951 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266953 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266956 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266959 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266961 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266964 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 
18:30:49.266967 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:30:49.274168 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266970 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266973 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266975 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266980 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266982 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266985 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266988 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266990 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266993 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266996 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.266999 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267001 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:30:49.274634 
ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267004 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267007 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267009 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267012 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267014 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267017 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267020 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267022 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:30:49.274634 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267025 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267027 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267030 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267032 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267035 2580 feature_gate.go:328] 
unrecognized feature gate: AzureDedicatedHosts Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267037 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267040 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267043 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267046 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267048 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267051 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267053 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267058 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267060 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267063 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267067 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267069 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267072 
2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267074 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267081 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:30:49.275120 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267084 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267086 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267089 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267092 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267094 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.267097 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.267810 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:30:49.275106 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.275122 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275186 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275193 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275197 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275202 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275205 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275208 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:49.275632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275211 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275214 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275216 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275220 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275223 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275226 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275228 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275231 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275233 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275236 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275238 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275241 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275243 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275246 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275249 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275252 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275255 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275258 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275261 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:49.276044 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275263 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275266 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275269 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275271 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275273 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275276 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275280 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275283 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275286 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275288 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275291 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275294 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275296 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275299 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275302 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275304 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275307 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275310 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275312 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275314 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:49.276518 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275317 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275320 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275322 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275325 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275327 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275330 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275332 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275335 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275337 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275339 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275342 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275345 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275348 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275351 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275353 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275356 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275358 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275361 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275364 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275367 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:49.277015 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275371 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275374 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275377 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275379 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275382 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275384 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275396 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275399 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275402 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275405 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275408 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275410 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275413 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275415 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275418 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275421 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275423 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275426 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275429 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275431 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:49.277570 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275434 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.275439 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275536 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275543 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275546 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275549 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275552 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275554 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275557 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275560 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275567 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275570 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275573 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275576 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275579 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:49.278057 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275581 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275586 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275590 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275593 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275596 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275598 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275601 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275604 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275606 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275609 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275611 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275614 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275617 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275619 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275622 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275624 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275627 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275629 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275632 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:49.278428 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275635 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275638 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275641 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275644 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275646 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275649 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275652 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275654 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275657 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275660 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275663 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275665 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275668 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275670 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275673 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275675 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275678 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275681 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275683 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275686 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:49.278899 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275688 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275691 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275693 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275696 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275699 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275701 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275704 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275707 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275710 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275713 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275716 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275719 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275721 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275724 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275726 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275730 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275732 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275735 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275738 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275740 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:49.279459 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275743 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275746 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275749 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275751 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275754 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275756 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275758 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275761 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275763 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275766 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275768 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275771 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275774 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:49.275776 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.275781 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.276549 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:30:49.279998 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.279473 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:30:49.280491 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.280479 2580 server.go:1019] "Starting client certificate rotation"
Apr 16 18:30:49.280607 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.280587 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:49.281448 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.281434 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:49.310547 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.310522 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:49.317922 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.317901 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:49.339737 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.339714 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:30:49.346613 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.346596 2580 log.go:25] "Validated CRI v1 image API"
Apr 16 18:30:49.347829 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.347810 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:30:49.351700 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.351675 2580 fs.go:135] Filesystem UUIDs: map[111298ad-c278-429a-97fd-28b1b532fe3c:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a03f3878-c536-4ae4-82e3-c32a7afa2fd1:/dev/nvme0n1p4]
Apr 16 18:30:49.351793 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.351698 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:30:49.355438 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.355416 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:49.356852 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.356729 2580 manager.go:217] Machine: {Timestamp:2026-04-16 18:30:49.355517069 +0000 UTC m=+0.444780654 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3093493 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b4dc6d1e6a1e9ecb1ad932bb7f229 SystemUUID:ec2b4dc6-d1e6-a1e9-ecb1-ad932bb7f229 BootID:f7801e98-9866-4f23-99d3-44d6c849fe81 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bb:71:84:ef:29 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bb:71:84:ef:29 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:7a:12:76:b5:a5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:30:49.356852 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.356843 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:30:49.357010 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.356951 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:30:49.358810 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.358783 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:30:49.358976 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.358813 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-146.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:30:49.359064 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.358993 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:30:49.359064 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.359006 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:30:49.359064 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.359030 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:30:49.359904 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.359892 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:30:49.361722 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.361710 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:30:49.362024 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.362011 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:30:49.364517 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.364505 2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:30:49.364589 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.364524 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:30:49.364589 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.364543 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:30:49.364589 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.364559 2580 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:30:49.364589 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.364572 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:30:49.365676 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.365663 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:49.365741 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.365687 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:49.368824 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.368809 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:30:49.370628 ip-10-0-135-146
kubenswrapper[2580]: I0416 18:30:49.370615 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:30:49.372205 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372193 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372210 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372217 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372222 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372228 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372234 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372240 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372245 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372252 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:30:49.372265 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372259 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:30:49.372495 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372276 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
18:30:49.372495 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.372284 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:30:49.374299 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.374290 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:30:49.374299 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.374300 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:30:49.377364 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.377330 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-146.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:30:49.377423 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.377400 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-146.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:30:49.377454 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.377400 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:30:49.378307 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.378295 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:30:49.378350 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.378333 2580 server.go:1295] "Started kubelet" Apr 16 18:30:49.378482 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.378437 2580 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 16 18:30:49.378537 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.378519 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:30:49.378577 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.378454 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:30:49.379206 ip-10-0-135-146 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:30:49.381753 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.381728 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tpg2g" Apr 16 18:30:49.381820 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.381783 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:30:49.384000 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.383980 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:30:49.388227 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.388210 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tpg2g" Apr 16 18:30:49.388838 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.387826 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-146.ec2.internal.18a6e9df8be5bb8c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-146.ec2.internal,UID:ip-10-0-135-146.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-146.ec2.internal,},FirstTimestamp:2026-04-16 18:30:49.378306956 +0000 UTC m=+0.467570539,LastTimestamp:2026-04-16 
18:30:49.378306956 +0000 UTC m=+0.467570539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-146.ec2.internal,}" Apr 16 18:30:49.389751 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.389736 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:30:49.389813 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.389787 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:30:49.390420 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.390402 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:30:49.390496 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390426 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:30:49.390496 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390431 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:30:49.390496 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390462 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:30:49.390642 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390543 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:30:49.390642 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390553 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:30:49.390642 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390616 2580 factory.go:55] Registering systemd factory Apr 16 18:30:49.390642 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390631 2580 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:30:49.390810 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.390723 2580 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-135-146.ec2.internal\" not found" Apr 16 18:30:49.390858 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390831 2580 factory.go:153] Registering CRI-O factory Apr 16 18:30:49.390858 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390854 2580 factory.go:223] Registration of the crio container factory successfully Apr 16 18:30:49.390949 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390917 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:30:49.390949 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390942 2580 factory.go:103] Registering Raw factory Apr 16 18:30:49.391029 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.390959 2580 manager.go:1196] Started watching for new ooms in manager Apr 16 18:30:49.391865 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.391850 2580 manager.go:319] Starting recovery of all containers Apr 16 18:30:49.397867 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.397699 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:49.400768 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.400746 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-146.ec2.internal\" not found" node="ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.402713 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.402699 2580 manager.go:324] Recovery completed Apr 16 18:30:49.403945 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.403920 2580 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 
18:30:49.408002 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.407986 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:49.410613 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.410598 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:49.410691 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.410626 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:49.410691 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.410636 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:49.411137 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.411119 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:30:49.411137 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.411134 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:30:49.411295 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.411167 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:30:49.414003 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.413989 2580 policy_none.go:49] "None policy: Start" Apr 16 18:30:49.414100 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.414008 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:30:49.414100 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.414021 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:30:49.454726 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.454710 2580 manager.go:341] "Starting Device Plugin manager" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.454752 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" 
checkpoint="kubelet_internal_checkpoint" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.454763 2580 server.go:85] "Starting device plugin registration server" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.455014 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.455027 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.455118 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.455209 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.455217 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.455678 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:30:49.460254 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.455715 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-146.ec2.internal\" not found" Apr 16 18:30:49.533003 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.532924 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:30:49.534167 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.534136 2580 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 18:30:49.534288 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.534179 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:30:49.534288 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.534204 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:30:49.534288 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.534210 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:30:49.534288 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.534247 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:30:49.538581 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.538561 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:49.555730 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.555707 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:49.557136 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.557121 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:49.557221 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.557169 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:49.557221 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.557182 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:49.557221 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.557204 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.565977 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:30:49.565956 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.566066 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.565979 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-146.ec2.internal\": node \"ip-10-0-135-146.ec2.internal\" not found" Apr 16 18:30:49.582970 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.582948 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found" Apr 16 18:30:49.636117 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.636084 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal"] Apr 16 18:30:49.636215 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.636180 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:49.640642 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.640627 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:49.640719 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.640656 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:49.640719 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.640670 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:49.642238 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.642226 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:49.642400 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.642385 2580 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.642434 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.642417 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:49.643563 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.643547 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:49.643651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.643578 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:49.643651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.643548 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:49.643651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.643619 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:49.643651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.643631 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:49.643651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.643592 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:49.645627 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.645609 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.645718 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.645633 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:49.646356 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.646343 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:49.646412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.646365 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:49.646412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.646375 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:49.676895 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.676874 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-146.ec2.internal\" not found" node="ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.681363 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.681345 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-146.ec2.internal\" not found" node="ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.683395 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.683370 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found" Apr 16 18:30:49.693063 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.693044 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6609aa2fa347a80cd94ed61d2e93259b-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal\" (UID: \"6609aa2fa347a80cd94ed61d2e93259b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.693160 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.693077 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6609aa2fa347a80cd94ed61d2e93259b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal\" (UID: \"6609aa2fa347a80cd94ed61d2e93259b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.693160 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.693100 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c7acda094cc50b1e38ea78cc83b01fd2-config\") pod \"kube-apiserver-proxy-ip-10-0-135-146.ec2.internal\" (UID: \"c7acda094cc50b1e38ea78cc83b01fd2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.784055 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.783975 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found" Apr 16 18:30:49.793701 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.793674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6609aa2fa347a80cd94ed61d2e93259b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal\" (UID: \"6609aa2fa347a80cd94ed61d2e93259b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.793797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.793706 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/c7acda094cc50b1e38ea78cc83b01fd2-config\") pod \"kube-apiserver-proxy-ip-10-0-135-146.ec2.internal\" (UID: \"c7acda094cc50b1e38ea78cc83b01fd2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.793797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.793724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6609aa2fa347a80cd94ed61d2e93259b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal\" (UID: \"6609aa2fa347a80cd94ed61d2e93259b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.793797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.793770 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6609aa2fa347a80cd94ed61d2e93259b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal\" (UID: \"6609aa2fa347a80cd94ed61d2e93259b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.793797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.793770 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6609aa2fa347a80cd94ed61d2e93259b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal\" (UID: \"6609aa2fa347a80cd94ed61d2e93259b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" Apr 16 18:30:49.793797 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.793784 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c7acda094cc50b1e38ea78cc83b01fd2-config\") pod \"kube-apiserver-proxy-ip-10-0-135-146.ec2.internal\" (UID: \"c7acda094cc50b1e38ea78cc83b01fd2\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal"
Apr 16 18:30:49.885135 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.885086 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:49.980654 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.980612 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal"
Apr 16 18:30:49.983875 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:49.983858 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal"
Apr 16 18:30:49.985993 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:49.985971 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.086874 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.086789 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.187316 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.187285 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.280775 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.280752 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:30:50.281260 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.280915 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:30:50.281260 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.280934 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:30:50.287905 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.287881 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.388481 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.388399 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.390085 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.390056 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:30:50.390247 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.390101 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:25:49 +0000 UTC" deadline="2028-01-19 09:08:01.210163656 +0000 UTC"
Apr 16 18:30:50.390247 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.390170 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15422h37m10.819999148s"
Apr 16 18:30:50.399378 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.399358 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:50.436487 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.436460 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5khct"
Apr 16 18:30:50.444607 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.444588 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5khct"
Apr 16 18:30:50.489136 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.489109 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.576131 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:50.576094 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7acda094cc50b1e38ea78cc83b01fd2.slice/crio-33854908cb6399dfd4bc5b25a591e402feee1865640bbbd93deb50b2fe66be2c WatchSource:0}: Error finding container 33854908cb6399dfd4bc5b25a591e402feee1865640bbbd93deb50b2fe66be2c: Status 404 returned error can't find the container with id 33854908cb6399dfd4bc5b25a591e402feee1865640bbbd93deb50b2fe66be2c
Apr 16 18:30:50.576609 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:50.576589 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6609aa2fa347a80cd94ed61d2e93259b.slice/crio-393e73586ead693aba5c82fba16cbe2c5f48ea4c4d321235ee89123eb4866fab WatchSource:0}: Error finding container 393e73586ead693aba5c82fba16cbe2c5f48ea4c4d321235ee89123eb4866fab: Status 404 returned error can't find the container with id 393e73586ead693aba5c82fba16cbe2c5f48ea4c4d321235ee89123eb4866fab
Apr 16 18:30:50.581561 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.581546 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:30:50.590160 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.590128 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.691012 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.690934 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.791304 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.791259 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.892017 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:50.891974 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-146.ec2.internal\" not found"
Apr 16 18:30:50.943699 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.943632 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:50.954643 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.954614 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:50.990424 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:50.990385 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal"
Apr 16 18:30:51.002015 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.001990 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:51.003013 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.002991 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal"
Apr 16 18:30:51.009276 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.009252 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:51.365580 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.365546 2580 apiserver.go:52] "Watching apiserver"
Apr 16 18:30:51.370888 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.370863 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:30:51.371358 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.371331 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-ncxt7","openshift-multus/network-metrics-daemon-sjjgw","kube-system/konnectivity-agent-22wdj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw","openshift-image-registry/node-ca-jz4gk","openshift-multus/multus-additional-cni-plugins-zrvtq","openshift-network-diagnostics/network-check-target-2bwhf","openshift-network-operator/iptables-alerter-vm9vr","openshift-ovn-kubernetes/ovnkube-node-6nzqh","kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal","openshift-cluster-node-tuning-operator/tuned-sp2hq","openshift-dns/node-resolver-sxsqt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal"]
Apr 16 18:30:51.373433 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.373410 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:51.373538 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:51.373485 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:30:51.376246 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.376226 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:51.376343 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:51.376300 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:30:51.378033 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.378014 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-22wdj"
Apr 16 18:30:51.379823 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.379764 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.379914 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.379871 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jz4gk"
Apr 16 18:30:51.380663 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.380630 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:30:51.380775 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.380670 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xs87t\""
Apr 16 18:30:51.380775 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.380670 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:30:51.381451 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.381408 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.381878 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.381846 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:30:51.382345 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.381999 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:30:51.382345 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.382177 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:30:51.382494 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.382349 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jmdtp\""
Apr 16 18:30:51.382494 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.382435 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:30:51.382596 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.382517 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:30:51.382596 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.382546 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:30:51.382689 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.382663 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9rx88\""
Apr 16 18:30:51.383012 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.382996 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.384490 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.384467 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:30:51.384600 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.384535 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:30:51.384756 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.384738 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vm9vr"
Apr 16 18:30:51.384960 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.384923 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:30:51.386393 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.385320 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mzmbh\""
Apr 16 18:30:51.386393 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.386086 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:30:51.386393 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.386233 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:30:51.386393 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.386248 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:30:51.387164 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.386907 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:30:51.387164 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.387102 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-sqt88\""
Apr 16 18:30:51.388356 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.387301 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-bp4wk\""
Apr 16 18:30:51.388839 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.388476 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:30:51.388839 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.388657 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:30:51.390888 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.389745 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.393217 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.392966 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:30:51.393217 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.392999 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:30:51.393427 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.393266 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:30:51.393427 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.393344 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:30:51.393576 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.393560 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:30:51.393706 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.393675 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:30:51.393895 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.393854 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-v56sb\""
Apr 16 18:30:51.394239 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.394212 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sxsqt"
Apr 16 18:30:51.394336 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.394241 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.396947 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.396855 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:30:51.397061 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.396952 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:30:51.397124 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.397100 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-s267k\""
Apr 16 18:30:51.397198 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.397172 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dxrxn\""
Apr 16 18:30:51.397429 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.397413 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:30:51.397530 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.397515 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:30:51.401638 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401618 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-run-netns\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.401723 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401656 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovnkube-script-lib\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.401723 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401683 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/15291a43-2092-47ea-b8ab-a7363155516e-agent-certs\") pod \"konnectivity-agent-22wdj\" (UID: \"15291a43-2092-47ea-b8ab-a7363155516e\") " pod="kube-system/konnectivity-agent-22wdj"
Apr 16 18:30:51.401723 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cnibin\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.401892 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401729 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.401892 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401752 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:51.401892 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401776 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5pvq\" (UniqueName: \"kubernetes.io/projected/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-kube-api-access-z5pvq\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr"
Apr 16 18:30:51.401892 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401798 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-os-release\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.401892 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401823 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-iptables-alerter-script\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr"
Apr 16 18:30:51.401892 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401845 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-sys\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.401892 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjrtg\" (UniqueName: \"kubernetes.io/projected/266cf30f-494c-4756-9005-740f5b60c795-kube-api-access-kjrtg\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rzrc\" (UniqueName: \"kubernetes.io/projected/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-kube-api-access-8rzrc\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401936 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-multus-certs\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-etc-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.401986 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-cni-netd\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402008 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-kubernetes\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402031 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1300a99-4d7a-47e3-9d8e-404608c14ae7-cni-binary-copy\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402058 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-cni-bin\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402080 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402109 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-ovn\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402133 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-device-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402182 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/15291a43-2092-47ea-b8ab-a7363155516e-konnectivity-ca\") pod \"konnectivity-agent-22wdj\" (UID: \"15291a43-2092-47ea-b8ab-a7363155516e\") " pod="kube-system/konnectivity-agent-22wdj"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402213 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-slash\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovn-node-metrics-cert\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402267 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-lib-modules\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402298 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cni-binary-copy\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402320 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-kubelet\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402349 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-systemd\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402372 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402396 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysconfig\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402420 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccd57\" (UniqueName: \"kubernetes.io/projected/642f0536-3a1b-4d5c-bb3d-e7128392b218-kube-api-access-ccd57\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402444 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402502 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-cnibin\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402526 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-socket-dir-parent\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402592 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-node-log\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402617 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402640 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3622caf8-9f1b-49f3-8219-6df05c25252f-host\") pod \"node-ca-jz4gk\" (UID: \"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402665 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402691 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-system-cni-dir\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.402836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402805 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-k8s-cni-cncf-io\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402836 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-hostroot\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402859 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-cni-bin\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-modprobe-d\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402908 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/266cf30f-494c-4756-9005-740f5b60c795-etc-tuned\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402931 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/266cf30f-494c-4756-9005-740f5b60c795-tmp\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402956 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-var-lib-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.402983 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovnkube-config\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403020 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qsw4\" (UniqueName: \"kubernetes.io/projected/3622caf8-9f1b-49f3-8219-6df05c25252f-kube-api-access-8qsw4\") pod \"node-ca-jz4gk\" (UID:
\"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403047 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-cni-multus\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403071 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-conf-dir\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403096 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-system-cni-dir\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403121 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-kubelet\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403187 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-registration-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403227 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3622caf8-9f1b-49f3-8219-6df05c25252f-serviceca\") pod \"node-ca-jz4gk\" (UID: \"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403255 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysctl-conf\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403282 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-netns\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.403411 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403311 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sjb2\" (UniqueName: \"kubernetes.io/projected/e1300a99-4d7a-47e3-9d8e-404608c14ae7-kube-api-access-4sjb2\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403337 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-sys-fs\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403362 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvqxg\" (UniqueName: \"kubernetes.io/projected/991cf6e5-5904-49e9-9916-a5c057fd14e8-kube-api-access-vvqxg\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403421 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysctl-d\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-daemon-config\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403481 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-6nzqh\" (UID: 
\"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403505 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-systemd\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403530 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-etc-kubernetes\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403556 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-systemd-units\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403595 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-log-socket\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-env-overrides\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403650 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-socket-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403709 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-host-slash\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403735 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-var-lib-kubelet\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403760 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-host\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403785 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-cni-dir\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.404263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403815 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw775\" (UniqueName: \"kubernetes.io/projected/6a615bb3-b76b-4c92-9085-2d164914c2aa-kube-api-access-sw775\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.405000 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403842 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-etc-selinux\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.405000 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403868 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-run\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.405000 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403936 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " 
pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:30:51.405000 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.403963 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-os-release\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.446651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.446619 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:50 +0000 UTC" deadline="2027-12-25 15:18:29.447886166 +0000 UTC" Apr 16 18:30:51.446651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.446651 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14828h47m38.001238477s" Apr 16 18:30:51.492881 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.492847 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:30:51.505084 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505054 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-kubelet\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505091 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-systemd\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: 
I0416 18:30:51.505118 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505155 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysconfig\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505178 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccd57\" (UniqueName: \"kubernetes.io/projected/642f0536-3a1b-4d5c-bb3d-e7128392b218-kube-api-access-ccd57\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505203 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505202 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-kubelet\") pod \"ovnkube-node-6nzqh\" (UID: 
\"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505227 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-cnibin\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505202 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-systemd\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505225 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505280 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505257 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysconfig\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505291 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-cnibin\") pod \"multus-ncxt7\" (UID: 
\"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505341 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-socket-dir-parent\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505370 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-node-log\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505398 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505422 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3622caf8-9f1b-49f3-8219-6df05c25252f-host\") pod \"node-ca-jz4gk\" (UID: \"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505447 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505456 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-socket-dir-parent\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505490 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505473 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-system-cni-dir\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505510 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3622caf8-9f1b-49f3-8219-6df05c25252f-host\") pod \"node-ca-jz4gk\" (UID: \"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505535 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-system-cni-dir\") pod \"multus-ncxt7\" 
(UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-k8s-cni-cncf-io\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505569 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-k8s-cni-cncf-io\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505578 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-hostroot\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505573 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-node-log\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505604 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-cni-bin\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505609 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-hostroot\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.505704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505636 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-cni-bin\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505637 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-modprobe-d\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505663 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/266cf30f-494c-4756-9005-740f5b60c795-etc-tuned\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/266cf30f-494c-4756-9005-740f5b60c795-tmp\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 
18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505713    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-var-lib-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505739    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovnkube-config\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505762    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qsw4\" (UniqueName: \"kubernetes.io/projected/3622caf8-9f1b-49f3-8219-6df05c25252f-kube-api-access-8qsw4\") pod \"node-ca-jz4gk\" (UID: \"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505786    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-cni-multus\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505807    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-conf-dir\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505808    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-var-lib-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505805    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-modprobe-d\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505823    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-system-cni-dir\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505858    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-system-cni-dir\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505889    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-conf-dir\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505925    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-cni-multus\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.505989    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-kubelet\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506024    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-registration-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506031    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-kubelet\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.506498 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506034    2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506052    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3622caf8-9f1b-49f3-8219-6df05c25252f-serviceca\") pod \"node-ca-jz4gk\" (UID: \"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506084    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysctl-conf\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506098    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-registration-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506164    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-netns\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506106    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-netns\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506208    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sjb2\" (UniqueName: \"kubernetes.io/projected/e1300a99-4d7a-47e3-9d8e-404608c14ae7-kube-api-access-4sjb2\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506238    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-sys-fs\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506249    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysctl-conf\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506265    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvqxg\" (UniqueName: \"kubernetes.io/projected/991cf6e5-5904-49e9-9916-a5c057fd14e8-kube-api-access-vvqxg\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506291    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysctl-d\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506317    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-daemon-config\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506343    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506366    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-systemd\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506391    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-etc-kubernetes\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506433    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-systemd-units\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506458    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-log-socket\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.507337 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506484    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-env-overrides\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506509    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-socket-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506536    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-host-slash\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506537    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506550    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3622caf8-9f1b-49f3-8219-6df05c25252f-serviceca\") pod \"node-ca-jz4gk\" (UID: \"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506559    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-var-lib-kubelet\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506584    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-host\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506609    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovnkube-config\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506619    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-systemd\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506640    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-host\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506671    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-sys-fs\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506707    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-var-lib-kubelet\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506788    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506839    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-systemd-units\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506880    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-etc-kubernetes\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506923    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-log-socket\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.506962    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-cni-dir\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507019    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-socket-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.508086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507033    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-cni-dir\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507068    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507077    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-host-slash\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507071    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-sysctl-d\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507096    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw775\" (UniqueName: \"kubernetes.io/projected/6a615bb3-b76b-4c92-9085-2d164914c2aa-kube-api-access-sw775\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507133    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-etc-selinux\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507176    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-run\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507200    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507215    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1300a99-4d7a-47e3-9d8e-404608c14ae7-multus-daemon-config\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507225    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-os-release\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507256    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e25c8650-bbd6-4127-a3ea-3a79b45748b6-hosts-file\") pod \"node-resolver-sxsqt\" (UID: \"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507282    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-run\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507290    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc8tf\" (UniqueName: \"kubernetes.io/projected/e25c8650-bbd6-4127-a3ea-3a79b45748b6-kube-api-access-zc8tf\") pod \"node-resolver-sxsqt\" (UID: \"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507318    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-run-netns\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507343    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovnkube-script-lib\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507369    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-etc-selinux\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507380    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-os-release\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.508917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507401    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/15291a43-2092-47ea-b8ab-a7363155516e-agent-certs\") pod \"konnectivity-agent-22wdj\" (UID: \"15291a43-2092-47ea-b8ab-a7363155516e\") " pod="kube-system/konnectivity-agent-22wdj"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507418    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-run-netns\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507451    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cnibin\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507482    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507506    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507531    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5pvq\" (UniqueName: \"kubernetes.io/projected/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-kube-api-access-z5pvq\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507556    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-os-release\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:51.507574    2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507580    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-iptables-alerter-script\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507605    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-sys\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507631    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjrtg\" (UniqueName: \"kubernetes.io/projected/266cf30f-494c-4756-9005-740f5b60c795-kube-api-access-kjrtg\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507658    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rzrc\" (UniqueName: \"kubernetes.io/projected/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-kube-api-access-8rzrc\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507686    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-multus-certs\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507714    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-etc-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507741    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-cni-netd\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507744    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-env-overrides\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507765    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-kubernetes\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507779    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-sys\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.509704 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507801    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-run-multus-certs\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507790    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1300a99-4d7a-47e3-9d8e-404608c14ae7-cni-binary-copy\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507826    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cnibin\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507843    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-cni-bin\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507862    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovnkube-script-lib\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507875    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e25c8650-bbd6-4127-a3ea-3a79b45748b6-tmp-dir\") pod \"node-resolver-sxsqt\" (UID: \"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507905    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507933    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-ovn\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507936    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-os-release\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507964    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-device-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.507989    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/15291a43-2092-47ea-b8ab-a7363155516e-konnectivity-ca\") pod \"konnectivity-agent-22wdj\" (UID: \"15291a43-2092-47ea-b8ab-a7363155516e\") " pod="kube-system/konnectivity-agent-22wdj"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508015    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-slash\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508051    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovn-node-metrics-cert\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508076    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-lib-modules\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508103    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cni-binary-copy\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508104    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1300a99-4d7a-47e3-9d8e-404608c14ae7-host-var-lib-cni-bin\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508235    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-cni-netd\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.510555 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508257    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1300a99-4d7a-47e3-9d8e-404608c14ae7-cni-binary-copy\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7"
Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508276    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-ovn\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508311    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-etc-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508313    2580
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-etc-kubernetes\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508356 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-run-openvswitch\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/991cf6e5-5904-49e9-9916-a5c057fd14e8-device-dir\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508481 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/266cf30f-494c-4756-9005-740f5b60c795-lib-modules\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-iptables-alerter-script\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:30:51.508583 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a615bb3-b76b-4c92-9085-2d164914c2aa-host-slash\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:51.508624 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs podName:642f0536-3a1b-4d5c-bb3d-e7128392b218 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:52.008573711 +0000 UTC m=+3.097837299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs") pod "network-metrics-daemon-sjjgw" (UID: "642f0536-3a1b-4d5c-bb3d-e7128392b218") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.508815 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-cni-binary-copy\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.509304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/15291a43-2092-47ea-b8ab-a7363155516e-konnectivity-ca\") pod \"konnectivity-agent-22wdj\" (UID: \"15291a43-2092-47ea-b8ab-a7363155516e\") " pod="kube-system/konnectivity-agent-22wdj" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.509829 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/266cf30f-494c-4756-9005-740f5b60c795-tmp\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.509887 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/266cf30f-494c-4756-9005-740f5b60c795-etc-tuned\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.510656 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a615bb3-b76b-4c92-9085-2d164914c2aa-ovn-node-metrics-cert\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.511414 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.510697 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/15291a43-2092-47ea-b8ab-a7363155516e-agent-certs\") pod \"konnectivity-agent-22wdj\" (UID: \"15291a43-2092-47ea-b8ab-a7363155516e\") " pod="kube-system/konnectivity-agent-22wdj" Apr 16 18:30:51.513759 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:51.513345 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:51.513759 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:51.513378 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:51.513759 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:51.513393 2580 projected.go:194] Error preparing data for projected volume kube-api-access-cvr9m for pod openshift-network-diagnostics/network-check-target-2bwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:51.513759 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:51.513462 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m podName:ba95d7cd-292c-41ec-8417-d3768d65716d nodeName:}" failed. No retries permitted until 2026-04-16 18:30:52.013443535 +0000 UTC m=+3.102707125 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cvr9m" (UniqueName: "kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m") pod "network-check-target-2bwhf" (UID: "ba95d7cd-292c-41ec-8417-d3768d65716d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:51.515128 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.515102 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sjb2\" (UniqueName: \"kubernetes.io/projected/e1300a99-4d7a-47e3-9d8e-404608c14ae7-kube-api-access-4sjb2\") pod \"multus-ncxt7\" (UID: \"e1300a99-4d7a-47e3-9d8e-404608c14ae7\") " pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.515623 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.515600 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccd57\" (UniqueName: \"kubernetes.io/projected/642f0536-3a1b-4d5c-bb3d-e7128392b218-kube-api-access-ccd57\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:30:51.515623 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.515617 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvqxg\" (UniqueName: \"kubernetes.io/projected/991cf6e5-5904-49e9-9916-a5c057fd14e8-kube-api-access-vvqxg\") pod \"aws-ebs-csi-driver-node-x8wcw\" (UID: \"991cf6e5-5904-49e9-9916-a5c057fd14e8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.515758 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.515736 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qsw4\" (UniqueName: \"kubernetes.io/projected/3622caf8-9f1b-49f3-8219-6df05c25252f-kube-api-access-8qsw4\") pod \"node-ca-jz4gk\" (UID: 
\"3622caf8-9f1b-49f3-8219-6df05c25252f\") " pod="openshift-image-registry/node-ca-jz4gk" Apr 16 18:30:51.516285 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.516263 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw775\" (UniqueName: \"kubernetes.io/projected/6a615bb3-b76b-4c92-9085-2d164914c2aa-kube-api-access-sw775\") pod \"ovnkube-node-6nzqh\" (UID: \"6a615bb3-b76b-4c92-9085-2d164914c2aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.519898 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.519870 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rzrc\" (UniqueName: \"kubernetes.io/projected/c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9-kube-api-access-8rzrc\") pod \"multus-additional-cni-plugins-zrvtq\" (UID: \"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9\") " pod="openshift-multus/multus-additional-cni-plugins-zrvtq" Apr 16 18:30:51.520001 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.519977 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjrtg\" (UniqueName: \"kubernetes.io/projected/266cf30f-494c-4756-9005-740f5b60c795-kube-api-access-kjrtg\") pod \"tuned-sp2hq\" (UID: \"266cf30f-494c-4756-9005-740f5b60c795\") " pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:51.523879 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.523859 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5pvq\" (UniqueName: \"kubernetes.io/projected/6d14a25b-4a6f-4f1f-9754-0a6038be8b58-kube-api-access-z5pvq\") pod \"iptables-alerter-vm9vr\" (UID: \"6d14a25b-4a6f-4f1f-9754-0a6038be8b58\") " pod="openshift-network-operator/iptables-alerter-vm9vr" Apr 16 18:30:51.538917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.538871 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" 
event={"ID":"6609aa2fa347a80cd94ed61d2e93259b","Type":"ContainerStarted","Data":"393e73586ead693aba5c82fba16cbe2c5f48ea4c4d321235ee89123eb4866fab"} Apr 16 18:30:51.540028 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.540009 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal" event={"ID":"c7acda094cc50b1e38ea78cc83b01fd2","Type":"ContainerStarted","Data":"33854908cb6399dfd4bc5b25a591e402feee1865640bbbd93deb50b2fe66be2c"} Apr 16 18:30:51.609161 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.609116 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e25c8650-bbd6-4127-a3ea-3a79b45748b6-hosts-file\") pod \"node-resolver-sxsqt\" (UID: \"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt" Apr 16 18:30:51.609331 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.609174 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc8tf\" (UniqueName: \"kubernetes.io/projected/e25c8650-bbd6-4127-a3ea-3a79b45748b6-kube-api-access-zc8tf\") pod \"node-resolver-sxsqt\" (UID: \"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt" Apr 16 18:30:51.609331 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.609221 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e25c8650-bbd6-4127-a3ea-3a79b45748b6-tmp-dir\") pod \"node-resolver-sxsqt\" (UID: \"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt" Apr 16 18:30:51.609331 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.609301 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e25c8650-bbd6-4127-a3ea-3a79b45748b6-hosts-file\") pod \"node-resolver-sxsqt\" (UID: 
\"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt" Apr 16 18:30:51.609574 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.609546 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e25c8650-bbd6-4127-a3ea-3a79b45748b6-tmp-dir\") pod \"node-resolver-sxsqt\" (UID: \"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt" Apr 16 18:30:51.628563 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.628480 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc8tf\" (UniqueName: \"kubernetes.io/projected/e25c8650-bbd6-4127-a3ea-3a79b45748b6-kube-api-access-zc8tf\") pod \"node-resolver-sxsqt\" (UID: \"e25c8650-bbd6-4127-a3ea-3a79b45748b6\") " pod="openshift-dns/node-resolver-sxsqt" Apr 16 18:30:51.692801 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.692765 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:51.695837 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.695809 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-22wdj" Apr 16 18:30:51.706615 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.706586 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ncxt7" Apr 16 18:30:51.711540 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.711517 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:51.714761 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.714741 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" Apr 16 18:30:51.721457 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.721425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jz4gk" Apr 16 18:30:51.729130 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.729102 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" Apr 16 18:30:51.735859 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.735835 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vm9vr" Apr 16 18:30:51.743732 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.743686 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:30:51.752500 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.752473 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sxsqt" Apr 16 18:30:51.757080 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:51.757054 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" Apr 16 18:30:52.011163 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.011116 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:30:52.011339 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:52.011271 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:52.011404 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:52.011343 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs podName:642f0536-3a1b-4d5c-bb3d-e7128392b218 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:53.011327814 +0000 UTC m=+4.100591396 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs") pod "network-metrics-daemon-sjjgw" (UID: "642f0536-3a1b-4d5c-bb3d-e7128392b218") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:52.111577 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.111543 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:30:52.111741 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:52.111703 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:52.111741 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:52.111724 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:52.111741 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:52.111741 2580 projected.go:194] Error preparing data for projected volume kube-api-access-cvr9m for pod openshift-network-diagnostics/network-check-target-2bwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:52.111898 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:52.111807 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m podName:ba95d7cd-292c-41ec-8417-d3768d65716d nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:53.111787963 +0000 UTC m=+4.201051538 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvr9m" (UniqueName: "kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m") pod "network-check-target-2bwhf" (UID: "ba95d7cd-292c-41ec-8417-d3768d65716d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:52.300194 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.300166 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3622caf8_9f1b_49f3_8219_6df05c25252f.slice/crio-dcb9b97133a21728b54f00b9df0ec4c721c57a4d3d298a12908ade83bc4aa5c3 WatchSource:0}: Error finding container dcb9b97133a21728b54f00b9df0ec4c721c57a4d3d298a12908ade83bc4aa5c3: Status 404 returned error can't find the container with id dcb9b97133a21728b54f00b9df0ec4c721c57a4d3d298a12908ade83bc4aa5c3 Apr 16 18:30:52.301122 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.300993 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15291a43_2092_47ea_b8ab_a7363155516e.slice/crio-2411385dd12bd30cd48cbefad8dc95012fe474041b32b63f3e4708cb87f263da WatchSource:0}: Error finding container 2411385dd12bd30cd48cbefad8dc95012fe474041b32b63f3e4708cb87f263da: Status 404 returned error can't find the container with id 2411385dd12bd30cd48cbefad8dc95012fe474041b32b63f3e4708cb87f263da Apr 16 18:30:52.304029 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.304001 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7fce7b3_ec9b_49f6_8e70_8a22ed2f44c9.slice/crio-c69ae126ee4fb6981c56ef64767fa2c6febed62a37f08ceb07703450957011b6 WatchSource:0}: Error finding container 
c69ae126ee4fb6981c56ef64767fa2c6febed62a37f08ceb07703450957011b6: Status 404 returned error can't find the container with id c69ae126ee4fb6981c56ef64767fa2c6febed62a37f08ceb07703450957011b6 Apr 16 18:30:52.306559 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.306537 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991cf6e5_5904_49e9_9916_a5c057fd14e8.slice/crio-590170a889d7cadc40e98df14923b24db396a87997e2c9da55d5ff743e49fe55 WatchSource:0}: Error finding container 590170a889d7cadc40e98df14923b24db396a87997e2c9da55d5ff743e49fe55: Status 404 returned error can't find the container with id 590170a889d7cadc40e98df14923b24db396a87997e2c9da55d5ff743e49fe55 Apr 16 18:30:52.307743 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.307715 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d14a25b_4a6f_4f1f_9754_0a6038be8b58.slice/crio-98bb41441fb813b8a29002ea9e8cbdb80dd92fe73d61d5cc471809b7ebcb5c69 WatchSource:0}: Error finding container 98bb41441fb813b8a29002ea9e8cbdb80dd92fe73d61d5cc471809b7ebcb5c69: Status 404 returned error can't find the container with id 98bb41441fb813b8a29002ea9e8cbdb80dd92fe73d61d5cc471809b7ebcb5c69 Apr 16 18:30:52.308689 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.308668 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1300a99_4d7a_47e3_9d8e_404608c14ae7.slice/crio-0c4876c07de62cff168e97df19ce62ae25f53a00c29ad3688f1063e307967171 WatchSource:0}: Error finding container 0c4876c07de62cff168e97df19ce62ae25f53a00c29ad3688f1063e307967171: Status 404 returned error can't find the container with id 0c4876c07de62cff168e97df19ce62ae25f53a00c29ad3688f1063e307967171 Apr 16 18:30:52.309840 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.309811 2580 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod266cf30f_494c_4756_9005_740f5b60c795.slice/crio-b9be4e880b9f208bc40cb211070735288b572f51c998e4713618ac0fedc950f2 WatchSource:0}: Error finding container b9be4e880b9f208bc40cb211070735288b572f51c998e4713618ac0fedc950f2: Status 404 returned error can't find the container with id b9be4e880b9f208bc40cb211070735288b572f51c998e4713618ac0fedc950f2 Apr 16 18:30:52.312354 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.312011 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a615bb3_b76b_4c92_9085_2d164914c2aa.slice/crio-cd820ad6409df6835adb83184dbc9897874e4f3203951c4274ade60c05ca19b8 WatchSource:0}: Error finding container cd820ad6409df6835adb83184dbc9897874e4f3203951c4274ade60c05ca19b8: Status 404 returned error can't find the container with id cd820ad6409df6835adb83184dbc9897874e4f3203951c4274ade60c05ca19b8 Apr 16 18:30:52.312818 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:30:52.312799 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode25c8650_bbd6_4127_a3ea_3a79b45748b6.slice/crio-b04056bd64947bbf175b9acba6797e9f2812fa8276010da6f763093d474dcd5e WatchSource:0}: Error finding container b04056bd64947bbf175b9acba6797e9f2812fa8276010da6f763093d474dcd5e: Status 404 returned error can't find the container with id b04056bd64947bbf175b9acba6797e9f2812fa8276010da6f763093d474dcd5e Apr 16 18:30:52.447653 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.447478 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:50 +0000 UTC" deadline="2028-01-07 05:35:01.433511304 +0000 UTC" Apr 16 18:30:52.447653 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.447648 2580 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15131h4m8.985866929s"
Apr 16 18:30:52.535205 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.535088 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:52.535205 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.535088 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:52.535379 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:52.535230 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:30:52.535379 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:52.535331 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:30:52.541922 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.541894 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sxsqt" event={"ID":"e25c8650-bbd6-4127-a3ea-3a79b45748b6","Type":"ContainerStarted","Data":"b04056bd64947bbf175b9acba6797e9f2812fa8276010da6f763093d474dcd5e"}
Apr 16 18:30:52.542889 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.542866 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ncxt7" event={"ID":"e1300a99-4d7a-47e3-9d8e-404608c14ae7","Type":"ContainerStarted","Data":"0c4876c07de62cff168e97df19ce62ae25f53a00c29ad3688f1063e307967171"}
Apr 16 18:30:52.543807 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.543788 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jz4gk" event={"ID":"3622caf8-9f1b-49f3-8219-6df05c25252f","Type":"ContainerStarted","Data":"dcb9b97133a21728b54f00b9df0ec4c721c57a4d3d298a12908ade83bc4aa5c3"}
Apr 16 18:30:52.544838 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.544820 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"cd820ad6409df6835adb83184dbc9897874e4f3203951c4274ade60c05ca19b8"}
Apr 16 18:30:52.545785 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.545755 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" event={"ID":"266cf30f-494c-4756-9005-740f5b60c795","Type":"ContainerStarted","Data":"b9be4e880b9f208bc40cb211070735288b572f51c998e4713618ac0fedc950f2"}
Apr 16 18:30:52.546687 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.546668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vm9vr" event={"ID":"6d14a25b-4a6f-4f1f-9754-0a6038be8b58","Type":"ContainerStarted","Data":"98bb41441fb813b8a29002ea9e8cbdb80dd92fe73d61d5cc471809b7ebcb5c69"}
Apr 16 18:30:52.547560 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.547543 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" event={"ID":"991cf6e5-5904-49e9-9916-a5c057fd14e8","Type":"ContainerStarted","Data":"590170a889d7cadc40e98df14923b24db396a87997e2c9da55d5ff743e49fe55"}
Apr 16 18:30:52.548400 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.548383 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" event={"ID":"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9","Type":"ContainerStarted","Data":"c69ae126ee4fb6981c56ef64767fa2c6febed62a37f08ceb07703450957011b6"}
Apr 16 18:30:52.549368 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.549351 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-22wdj" event={"ID":"15291a43-2092-47ea-b8ab-a7363155516e","Type":"ContainerStarted","Data":"2411385dd12bd30cd48cbefad8dc95012fe474041b32b63f3e4708cb87f263da"}
Apr 16 18:30:52.550708 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.550690 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal" event={"ID":"c7acda094cc50b1e38ea78cc83b01fd2","Type":"ContainerStarted","Data":"66f0e19d804a2e543df57e796f225683bb1bfe3ea74aa1a5276f0d98bd6b81cf"}
Apr 16 18:30:52.564814 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:52.564780 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-146.ec2.internal" podStartSLOduration=1.564769041 podStartE2EDuration="1.564769041s" podCreationTimestamp="2026-04-16 18:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:52.564653768 +0000 UTC m=+3.653917362" watchObservedRunningTime="2026-04-16 18:30:52.564769041 +0000 UTC m=+3.654032633"
Apr 16 18:30:53.019195 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:53.017662 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:53.019195 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:53.017835 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:53.019195 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:53.017905 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs podName:642f0536-3a1b-4d5c-bb3d-e7128392b218 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:55.017885209 +0000 UTC m=+6.107148784 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs") pod "network-metrics-daemon-sjjgw" (UID: "642f0536-3a1b-4d5c-bb3d-e7128392b218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:53.118882 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:53.118768 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:53.119049 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:53.118946 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:53.119049 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:53.118966 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:53.119049 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:53.118979 2580 projected.go:194] Error preparing data for projected volume kube-api-access-cvr9m for pod openshift-network-diagnostics/network-check-target-2bwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:53.119049 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:53.119037 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m podName:ba95d7cd-292c-41ec-8417-d3768d65716d nodeName:}" failed. No retries permitted until 2026-04-16 18:30:55.119018619 +0000 UTC m=+6.208282191 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvr9m" (UniqueName: "kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m") pod "network-check-target-2bwhf" (UID: "ba95d7cd-292c-41ec-8417-d3768d65716d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:53.562684 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:53.562513 2580 generic.go:358] "Generic (PLEG): container finished" podID="6609aa2fa347a80cd94ed61d2e93259b" containerID="141c88a0ec62920944e1a8f977263dea9e5416b6b06c7a8129f8328bd349511d" exitCode=0
Apr 16 18:30:53.562684 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:53.562625 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" event={"ID":"6609aa2fa347a80cd94ed61d2e93259b","Type":"ContainerDied","Data":"141c88a0ec62920944e1a8f977263dea9e5416b6b06c7a8129f8328bd349511d"}
Apr 16 18:30:54.535575 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:54.534856 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:54.535575 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:54.534987 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:30:54.535575 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:54.535432 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:54.535575 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:54.535535 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:30:54.570325 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:54.570270 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" event={"ID":"6609aa2fa347a80cd94ed61d2e93259b","Type":"ContainerStarted","Data":"4fe6207f86f24de70c6ce33f272b6a97dc49ce68457c2065b4a6773b49693752"}
Apr 16 18:30:54.589903 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:54.589848 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-146.ec2.internal" podStartSLOduration=4.589831086 podStartE2EDuration="4.589831086s" podCreationTimestamp="2026-04-16 18:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:54.58899982 +0000 UTC m=+5.678263415" watchObservedRunningTime="2026-04-16 18:30:54.589831086 +0000 UTC m=+5.679094681"
Apr 16 18:30:55.037817 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:55.037776 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:55.038000 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:55.037927 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:55.038000 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:55.037998 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs podName:642f0536-3a1b-4d5c-bb3d-e7128392b218 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:59.037978906 +0000 UTC m=+10.127242502 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs") pod "network-metrics-daemon-sjjgw" (UID: "642f0536-3a1b-4d5c-bb3d-e7128392b218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:55.138482 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:55.138441 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:55.138666 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:55.138599 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:55.138666 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:55.138621 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:55.138666 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:55.138634 2580 projected.go:194] Error preparing data for projected volume kube-api-access-cvr9m for pod openshift-network-diagnostics/network-check-target-2bwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:55.138819 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:55.138694 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m podName:ba95d7cd-292c-41ec-8417-d3768d65716d nodeName:}" failed. No retries permitted until 2026-04-16 18:30:59.13867369 +0000 UTC m=+10.227937265 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvr9m" (UniqueName: "kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m") pod "network-check-target-2bwhf" (UID: "ba95d7cd-292c-41ec-8417-d3768d65716d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:56.534559 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:56.534395 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:56.534559 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:56.534545 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:30:56.535075 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:56.534395 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:56.535075 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:56.534659 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:30:58.534530 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:58.534441 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:58.534530 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:58.534474 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:58.535071 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:58.534587 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:30:58.535071 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:58.534712 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:30:59.071366 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:59.071324 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:30:59.071554 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:59.071510 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:59.071611 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:59.071586 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs podName:642f0536-3a1b-4d5c-bb3d-e7128392b218 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.071563449 +0000 UTC m=+18.160827026 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs") pod "network-metrics-daemon-sjjgw" (UID: "642f0536-3a1b-4d5c-bb3d-e7128392b218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:59.172399 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:30:59.171739 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:30:59.172399 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:59.171911 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:59.172399 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:59.171933 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:59.172399 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:59.171945 2580 projected.go:194] Error preparing data for projected volume kube-api-access-cvr9m for pod openshift-network-diagnostics/network-check-target-2bwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:59.172399 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:30:59.172004 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m podName:ba95d7cd-292c-41ec-8417-d3768d65716d nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.171985144 +0000 UTC m=+18.261248733 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvr9m" (UniqueName: "kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m") pod "network-check-target-2bwhf" (UID: "ba95d7cd-292c-41ec-8417-d3768d65716d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:00.535308 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:00.535267 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:31:00.535785 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:00.535418 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:31:00.535785 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:00.535267 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:31:00.535785 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:00.535766 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:31:02.535502 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:02.535457 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:31:02.536062 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:02.535457 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:31:02.536062 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:02.535598 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:31:02.536062 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:02.535642 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:31:04.535106 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:04.535071 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:31:04.535106 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:04.535092 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:31:04.535554 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:04.535214 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:31:04.535554 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:04.535344 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:31:06.534783 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:06.534747 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:31:06.535198 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:06.534897 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:31:06.535198 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:06.534988 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:31:06.535198 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:06.535099 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:31:07.130160 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:07.130113 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:31:07.130302 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:07.130267 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:31:07.130360 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:07.130333 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs podName:642f0536-3a1b-4d5c-bb3d-e7128392b218 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:23.130315132 +0000 UTC m=+34.219578707 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs") pod "network-metrics-daemon-sjjgw" (UID: "642f0536-3a1b-4d5c-bb3d-e7128392b218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:31:07.231373 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:07.231339 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:31:07.231567 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:07.231531 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:31:07.231567 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:07.231554 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:31:07.231567 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:07.231567 2580 projected.go:194] Error preparing data for projected volume kube-api-access-cvr9m for pod openshift-network-diagnostics/network-check-target-2bwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:07.231719 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:07.231633 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m podName:ba95d7cd-292c-41ec-8417-d3768d65716d nodeName:}" failed. No retries permitted until 2026-04-16 18:31:23.231613924 +0000 UTC m=+34.320877499 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvr9m" (UniqueName: "kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m") pod "network-check-target-2bwhf" (UID: "ba95d7cd-292c-41ec-8417-d3768d65716d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:31:08.534651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:08.534621 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:31:08.535050 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:08.534620 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:31:08.535050 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:08.534756 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:31:08.535050 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:08.534821 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:31:09.601600 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:09.601320 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" event={"ID":"266cf30f-494c-4756-9005-740f5b60c795","Type":"ContainerStarted","Data":"616dc98d9b203fa6eb2d26e1eabbdbef8c41a8b424e033e7b4a4292f67388fdf"}
Apr 16 18:31:09.604924 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:09.604892 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-22wdj" event={"ID":"15291a43-2092-47ea-b8ab-a7363155516e","Type":"ContainerStarted","Data":"d0e864ba16f3d5256366715562fe91ba2600288dde546e3e3e6909e80832e131"}
Apr 16 18:31:09.609296 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:09.608305 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ncxt7" event={"ID":"e1300a99-4d7a-47e3-9d8e-404608c14ae7","Type":"ContainerStarted","Data":"bbf68fc46fd99fa7388c5fe69152d6802fe7a7d5b853ef2da964fe9e33354c65"}
Apr 16 18:31:09.618775 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:09.618683 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-sp2hq" podStartSLOduration=3.660083782 podStartE2EDuration="20.618666217s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.312246911 +0000 UTC m=+3.401510485" lastFinishedPulling="2026-04-16 18:31:09.270829336 +0000 UTC m=+20.360092920" observedRunningTime="2026-04-16 18:31:09.617960997 +0000 UTC m=+20.707224591" watchObservedRunningTime="2026-04-16 18:31:09.618666217 +0000 UTC m=+20.707929812"
Apr 16 18:31:09.648877 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:09.648828 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-22wdj" podStartSLOduration=3.681067042 podStartE2EDuration="20.648811092s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.302937226 +0000 UTC m=+3.392200810" lastFinishedPulling="2026-04-16 18:31:09.270681286 +0000 UTC m=+20.359944860" observedRunningTime="2026-04-16 18:31:09.632337487 +0000 UTC m=+20.721601080" watchObservedRunningTime="2026-04-16 18:31:09.648811092 +0000 UTC m=+20.738074685"
Apr 16 18:31:10.044102 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.043940 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-22wdj"
Apr 16 18:31:10.535102 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.535073 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:31:10.535247 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.535115 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw"
Apr 16 18:31:10.535247 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:10.535209 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218"
Apr 16 18:31:10.535342 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:10.535319 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d"
Apr 16 18:31:10.614984 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.614952 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" event={"ID":"991cf6e5-5904-49e9-9916-a5c057fd14e8","Type":"ContainerStarted","Data":"8734331be601bc47a611b248997818ee25ada6a88a1d0e2d5cc06054699eca42"}
Apr 16 18:31:10.616303 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.616280 2580 generic.go:358] "Generic (PLEG): container finished" podID="c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9" containerID="6065bcfe8fa14bca2df5e50395ebc45d74a1fb4f99c7bb9a7ef5cc06dccfe256" exitCode=0
Apr 16 18:31:10.616387 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.616361 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" event={"ID":"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9","Type":"ContainerDied","Data":"6065bcfe8fa14bca2df5e50395ebc45d74a1fb4f99c7bb9a7ef5cc06dccfe256"}
Apr 16 18:31:10.617700 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.617677 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sxsqt" event={"ID":"e25c8650-bbd6-4127-a3ea-3a79b45748b6","Type":"ContainerStarted","Data":"06d12002421644e07f4b374b26e604f0b1757bec0c2fba81e5fe8541578edd7f"}
Apr 16 18:31:10.618884 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.618862 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jz4gk" event={"ID":"3622caf8-9f1b-49f3-8219-6df05c25252f","Type":"ContainerStarted","Data":"e83aa5d1bc67e9493df90880bcea86d9621494e4552eb4e8eb75c90ee08d81c3"}
Apr 16 18:31:10.621320 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.621284 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"c40ac851812481a39974fb352511f406d78c969a3a62a6881864cf590bbf27a7"}
Apr 16 18:31:10.621406 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.621323 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"4b619bd9e27b9d5067e84c1edeeb78e8c126e8436e500f1c53142c4661fc425a"}
Apr 16 18:31:10.621406 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.621339 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"d21bf2ba217e1c0a23629cdc3de3bb4f524f2418f6bc75179d75fb4bf5bec836"}
Apr 16 18:31:10.621406 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.621351 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"7f841bc178fe4e38c8340dd078761bf53a549da2355a7c042d82bcd58367fc4b"}
Apr 16 18:31:10.621406 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.621360 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"8d53a4350deca274021854da4ed819154170e9e4eef55303cb3bbe8d8aa23229"}
Apr 16 18:31:10.621406 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.621368 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"ebec0050f54e719f851d38a215bfdd67c31dcae8c0ad37b491e8f680f783c13c"}
Apr 16 18:31:10.639016 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.638975 2580 pod_startup_latency_tracker.go:104] "Observed pod startup
duration" pod="openshift-multus/multus-ncxt7" podStartSLOduration=4.619344591 podStartE2EDuration="21.638964087s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.310638048 +0000 UTC m=+3.399901621" lastFinishedPulling="2026-04-16 18:31:09.330257545 +0000 UTC m=+20.419521117" observedRunningTime="2026-04-16 18:31:09.656531167 +0000 UTC m=+20.745794767" watchObservedRunningTime="2026-04-16 18:31:10.638964087 +0000 UTC m=+21.728227681" Apr 16 18:31:10.652001 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.651956 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jz4gk" podStartSLOduration=4.683780954 podStartE2EDuration="21.651945613s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.302515704 +0000 UTC m=+3.391779280" lastFinishedPulling="2026-04-16 18:31:09.270680357 +0000 UTC m=+20.359943939" observedRunningTime="2026-04-16 18:31:10.651912264 +0000 UTC m=+21.741175858" watchObservedRunningTime="2026-04-16 18:31:10.651945613 +0000 UTC m=+21.741209206" Apr 16 18:31:10.666852 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.666812 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sxsqt" podStartSLOduration=4.65925352 podStartE2EDuration="21.666803529s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.314830034 +0000 UTC m=+3.404093606" lastFinishedPulling="2026-04-16 18:31:09.322380042 +0000 UTC m=+20.411643615" observedRunningTime="2026-04-16 18:31:10.666758511 +0000 UTC m=+21.756022105" watchObservedRunningTime="2026-04-16 18:31:10.666803529 +0000 UTC m=+21.756067121" Apr 16 18:31:10.992913 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:10.992889 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 
18:31:11.467773 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:11.467674 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:31:10.992910181Z","UUID":"6ae31b1e-0445-4d00-840d-43bde810697d","Handler":null,"Name":"","Endpoint":""} Apr 16 18:31:11.469982 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:11.469711 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:31:11.469982 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:11.469743 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:31:11.624614 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:11.624502 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vm9vr" event={"ID":"6d14a25b-4a6f-4f1f-9754-0a6038be8b58","Type":"ContainerStarted","Data":"9b93b3a3f60d3b6597b97e9b69f2f61cf8823186a050de9662f1805300d61879"} Apr 16 18:31:11.626565 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:11.626532 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" event={"ID":"991cf6e5-5904-49e9-9916-a5c057fd14e8","Type":"ContainerStarted","Data":"2f343232a71c72c1918dc6eaffcb03d827571525978be8bd5ff2b7b04055b23f"} Apr 16 18:31:11.646724 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:11.646673 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vm9vr" podStartSLOduration=5.661542218 podStartE2EDuration="22.646656625s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.309920365 +0000 UTC m=+3.399183940" lastFinishedPulling="2026-04-16 
18:31:09.295034776 +0000 UTC m=+20.384298347" observedRunningTime="2026-04-16 18:31:11.646159462 +0000 UTC m=+22.735423055" watchObservedRunningTime="2026-04-16 18:31:11.646656625 +0000 UTC m=+22.735920220" Apr 16 18:31:12.535566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:12.535385 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:12.535765 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:12.535397 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:12.535765 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:12.535652 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:12.535882 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:12.535766 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:12.631318 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:12.631285 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"984fd810bf735b5adaef1aa545deee1a626bae9c86d701354437de88629861be"} Apr 16 18:31:12.633207 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:12.633164 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" event={"ID":"991cf6e5-5904-49e9-9916-a5c057fd14e8","Type":"ContainerStarted","Data":"d92ee0528cb817f98c285096ed7b0c6ec10ed5e8437752450ab03063e9b4d151"} Apr 16 18:31:12.657308 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:12.657265 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x8wcw" podStartSLOduration=3.7730582459999997 podStartE2EDuration="23.657250062s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.308756693 +0000 UTC m=+3.398020268" lastFinishedPulling="2026-04-16 18:31:12.192948507 +0000 UTC m=+23.282212084" observedRunningTime="2026-04-16 18:31:12.656944511 +0000 UTC m=+23.746208103" watchObservedRunningTime="2026-04-16 18:31:12.657250062 +0000 UTC m=+23.746513654" Apr 16 18:31:13.112328 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:13.112290 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-22wdj" Apr 16 18:31:13.112995 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:13.112973 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-22wdj" Apr 16 18:31:13.636442 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:13.636412 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kube-system/konnectivity-agent-22wdj" Apr 16 18:31:14.534727 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:14.534685 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:14.534850 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:14.534690 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:14.534850 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:14.534793 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:14.535007 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:14.534900 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:14.641252 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:14.641032 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" event={"ID":"6a615bb3-b76b-4c92-9085-2d164914c2aa","Type":"ContainerStarted","Data":"48807024c51f5cc7ac84380fc8448f3901dc1855b4c41d0be91b512238570758"} Apr 16 18:31:14.641948 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:14.641406 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:31:14.656764 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:14.656736 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:31:14.669513 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:14.669458 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" podStartSLOduration=8.38317045 podStartE2EDuration="25.669439115s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.31408029 +0000 UTC m=+3.403343871" lastFinishedPulling="2026-04-16 18:31:09.600348965 +0000 UTC m=+20.689612536" observedRunningTime="2026-04-16 18:31:14.667640002 +0000 UTC m=+25.756903596" watchObservedRunningTime="2026-04-16 18:31:14.669439115 +0000 UTC m=+25.758702710" Apr 16 18:31:15.644255 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:15.644213 2580 generic.go:358] "Generic (PLEG): container finished" podID="c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9" containerID="8df4a71f601e22c59d76c266add9b31b25b1c98275738021c4bdc11b18cec94d" exitCode=0 Apr 16 18:31:15.644621 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:15.644289 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" 
event={"ID":"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9","Type":"ContainerDied","Data":"8df4a71f601e22c59d76c266add9b31b25b1c98275738021c4bdc11b18cec94d"} Apr 16 18:31:15.645353 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:15.644834 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:31:15.645353 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:15.644856 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:31:15.659263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:15.659235 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh" Apr 16 18:31:16.534653 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:16.534570 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:16.534791 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:16.534570 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:16.534791 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:16.534680 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:16.534791 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:16.534747 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:16.648206 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:16.648161 2580 generic.go:358] "Generic (PLEG): container finished" podID="c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9" containerID="57faad0b8d0392408e30e0fd19fc2a9a5f5ea5707a05366de78d31159eae4a36" exitCode=0 Apr 16 18:31:16.648589 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:16.648220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" event={"ID":"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9","Type":"ContainerDied","Data":"57faad0b8d0392408e30e0fd19fc2a9a5f5ea5707a05366de78d31159eae4a36"} Apr 16 18:31:17.652354 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:17.652254 2580 generic.go:358] "Generic (PLEG): container finished" podID="c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9" containerID="3a82f1a5f339f7597aecd5d8f7117eebc7577f58914c09d46e93276d78a76503" exitCode=0 Apr 16 18:31:17.652354 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:17.652338 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" event={"ID":"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9","Type":"ContainerDied","Data":"3a82f1a5f339f7597aecd5d8f7117eebc7577f58914c09d46e93276d78a76503"} Apr 16 18:31:18.535270 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:18.535226 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:18.535468 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:18.535356 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:18.535468 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:18.535408 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:18.535576 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:18.535538 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:20.535175 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:20.535128 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:20.535679 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:20.535129 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:20.535679 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:20.535267 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:20.535679 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:20.535336 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:22.535336 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:22.535300 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:22.535336 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:22.535335 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:22.535920 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:22.535429 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:22.535920 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:22.535520 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:23.142431 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:23.142404 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:23.142596 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:23.142516 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:23.142596 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:23.142565 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs podName:642f0536-3a1b-4d5c-bb3d-e7128392b218 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:55.14255109 +0000 UTC m=+66.231814661 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs") pod "network-metrics-daemon-sjjgw" (UID: "642f0536-3a1b-4d5c-bb3d-e7128392b218") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:23.242977 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:23.242947 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:23.243113 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:23.243092 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:31:23.243113 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:23.243114 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:31:23.243294 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:23.243124 2580 projected.go:194] Error preparing data for projected volume kube-api-access-cvr9m for pod openshift-network-diagnostics/network-check-target-2bwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:23.243294 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:23.243192 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m podName:ba95d7cd-292c-41ec-8417-d3768d65716d nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:55.243171684 +0000 UTC m=+66.332435261 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cvr9m" (UniqueName: "kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m") pod "network-check-target-2bwhf" (UID: "ba95d7cd-292c-41ec-8417-d3768d65716d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:23.665182 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:23.665132 2580 generic.go:358] "Generic (PLEG): container finished" podID="c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9" containerID="c5e03146e61a5157e3b70a61a760d36c1856ac396199c5be98641c14f2c8e486" exitCode=0 Apr 16 18:31:23.665594 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:23.665194 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" event={"ID":"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9","Type":"ContainerDied","Data":"c5e03146e61a5157e3b70a61a760d36c1856ac396199c5be98641c14f2c8e486"} Apr 16 18:31:24.534734 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:24.534693 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:24.534933 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:24.534837 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:24.534933 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:24.534875 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:24.535059 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:24.534952 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:24.669692 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:24.669656 2580 generic.go:358] "Generic (PLEG): container finished" podID="c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9" containerID="1aff3a6be7283e6efe04fd684bbca75f2e55cef5607a51847d42fe2501992e1b" exitCode=0 Apr 16 18:31:24.670192 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:24.669703 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" event={"ID":"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9","Type":"ContainerDied","Data":"1aff3a6be7283e6efe04fd684bbca75f2e55cef5607a51847d42fe2501992e1b"} Apr 16 18:31:25.674700 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:25.674667 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" event={"ID":"c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9","Type":"ContainerStarted","Data":"93b65060b1017486b77e94b4223cc6fdf90a3ac25fc342f14de06cefaebcbbd1"} Apr 16 18:31:25.700346 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:25.700304 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zrvtq" podStartSLOduration=5.784545813 podStartE2EDuration="36.700290723s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:30:52.306801876 +0000 UTC m=+3.396065447" lastFinishedPulling="2026-04-16 18:31:23.222546785 +0000 UTC 
m=+34.311810357" observedRunningTime="2026-04-16 18:31:25.698419689 +0000 UTC m=+36.787683282" watchObservedRunningTime="2026-04-16 18:31:25.700290723 +0000 UTC m=+36.789554315" Apr 16 18:31:26.535192 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:26.535140 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:26.535397 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:26.535140 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:26.535397 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:26.535259 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:26.535397 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:26.535352 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:28.535160 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:28.535114 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:28.535554 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:28.535113 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:28.535554 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:28.535236 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:28.535554 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:28.535289 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:30.535178 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:30.535127 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:30.535557 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:30.535127 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:30.535557 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:30.535248 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:30.535557 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:30.535312 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:32.535379 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:32.535329 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:32.535855 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:32.535329 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:32.535855 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:32.535466 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:32.535855 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:32.535513 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:33.460099 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:33.459918 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sjjgw"] Apr 16 18:31:33.460296 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:33.460216 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:33.460358 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:33.460328 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:33.462708 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:33.462678 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2bwhf"] Apr 16 18:31:33.462838 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:33.462782 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:33.462909 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:33.462887 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:34.534890 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.534860 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:34.535438 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.534860 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:34.535438 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:34.534972 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:34.535438 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:34.535039 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:34.677098 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.677066 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-989rq"] Apr 16 18:31:34.689093 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.689073 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.689221 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:34.689136 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-989rq" podUID="d2366368-d34f-4493-b764-4aa4105b1922" Apr 16 18:31:34.698293 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.698264 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-989rq"] Apr 16 18:31:34.698430 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.698333 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.698430 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:34.698403 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-989rq" podUID="d2366368-d34f-4493-b764-4aa4105b1922" Apr 16 18:31:34.824973 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.824899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d2366368-d34f-4493-b764-4aa4105b1922-kubelet-config\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.824973 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.824941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d2366368-d34f-4493-b764-4aa4105b1922-dbus\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.824973 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.824969 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.925227 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.925195 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d2366368-d34f-4493-b764-4aa4105b1922-kubelet-config\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.925227 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.925231 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/d2366368-d34f-4493-b764-4aa4105b1922-dbus\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.925415 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.925252 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.925415 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.925328 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d2366368-d34f-4493-b764-4aa4105b1922-kubelet-config\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:34.925415 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:34.925343 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:31:34.925415 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:34.925404 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret podName:d2366368-d34f-4493-b764-4aa4105b1922 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:35.425386659 +0000 UTC m=+46.514650243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret") pod "global-pull-secret-syncer-989rq" (UID: "d2366368-d34f-4493-b764-4aa4105b1922") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:31:34.925542 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:34.925500 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d2366368-d34f-4493-b764-4aa4105b1922-dbus\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:35.429208 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:35.429179 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:35.429378 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:35.429294 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:31:35.429378 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:35.429344 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret podName:d2366368-d34f-4493-b764-4aa4105b1922 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:36.429330214 +0000 UTC m=+47.518593785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret") pod "global-pull-secret-syncer-989rq" (UID: "d2366368-d34f-4493-b764-4aa4105b1922") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:31:36.436527 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:36.436496 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:36.436977 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:36.436607 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:31:36.436977 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:36.436657 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret podName:d2366368-d34f-4493-b764-4aa4105b1922 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:38.436641442 +0000 UTC m=+49.525905013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret") pod "global-pull-secret-syncer-989rq" (UID: "d2366368-d34f-4493-b764-4aa4105b1922") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:31:36.535196 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:36.535170 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:36.535196 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:36.535187 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:36.535353 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:36.535170 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:36.535353 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:36.535265 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2bwhf" podUID="ba95d7cd-292c-41ec-8417-d3768d65716d" Apr 16 18:31:36.535353 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:36.535334 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-989rq" podUID="d2366368-d34f-4493-b764-4aa4105b1922" Apr 16 18:31:36.535455 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:36.535430 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sjjgw" podUID="642f0536-3a1b-4d5c-bb3d-e7128392b218" Apr 16 18:31:37.237278 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.237255 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-146.ec2.internal" event="NodeReady" Apr 16 18:31:37.237399 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.237387 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:31:37.277867 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.277845 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-65688f687b-mwsdh"] Apr 16 18:31:37.317372 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.317347 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m2f22"] Apr 16 18:31:37.317507 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.317491 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.321707 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.321679 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:31:37.321707 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.321696 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-p5mxf\"" Apr 16 18:31:37.321873 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.321711 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:31:37.321873 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.321760 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:31:37.327964 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:31:37.327634 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:31:37.350489 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.350419 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6d74h"] Apr 16 18:31:37.350587 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.350518 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.354767 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.354750 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mnpbz\"" Apr 16 18:31:37.354920 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.354774 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:31:37.355015 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.354774 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:31:37.372289 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.372265 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65688f687b-mwsdh"] Apr 16 18:31:37.372289 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.372289 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6d74h"] Apr 16 18:31:37.372412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.372299 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m2f22"] Apr 16 18:31:37.372412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.372308 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m6kqv"] Apr 16 18:31:37.372412 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:31:37.372400 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.375182 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.375162 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:31:37.375345 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.375331 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:31:37.375429 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.375359 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:31:37.375429 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.375393 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:31:37.375624 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.375609 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sjnps\"" Apr 16 18:31:37.390680 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.390661 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m6kqv"] Apr 16 18:31:37.390763 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.390754 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m6kqv" Apr 16 18:31:37.394112 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.394065 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-44gxs\"" Apr 16 18:31:37.394219 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.394201 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:31:37.394495 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.394475 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:31:37.395120 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.395102 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:31:37.442831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.442803 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tnds\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-kube-api-access-8tnds\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.442831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.442831 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-registry-tls\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.442863 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-trusted-ca\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.442902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5dcg\" (UniqueName: \"kubernetes.io/projected/0a157f4b-852b-4fc1-867b-72319f3a23ef-kube-api-access-x5dcg\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.442947 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-registry-certificates\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.442986 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-installation-pull-secrets\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.443019 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-ca-trust-extracted\") pod 
\"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.443049 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a157f4b-852b-4fc1-867b-72319f3a23ef-metrics-tls\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.443080 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-image-registry-private-configuration\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.443096 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a157f4b-852b-4fc1-867b-72319f3a23ef-config-volume\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.443122 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-bound-sa-token\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.443324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.443139 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a157f4b-852b-4fc1-867b-72319f3a23ef-tmp-dir\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.544413 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-registry-certificates\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.544527 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-installation-pull-secrets\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.544527 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544447 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f721ab0c-9cef-4965-905d-b47537a6ad94-crio-socket\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.544527 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544471 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f721ab0c-9cef-4965-905d-b47537a6ad94-data-volume\") pod \"insights-runtime-extractor-6d74h\" (UID: 
\"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.544527 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544506 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-ca-trust-extracted\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.544718 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544549 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dt7s\" (UniqueName: \"kubernetes.io/projected/5db233fe-b415-4303-a7db-96df89fba6f1-kube-api-access-2dt7s\") pod \"ingress-canary-m6kqv\" (UID: \"5db233fe-b415-4303-a7db-96df89fba6f1\") " pod="openshift-ingress-canary/ingress-canary-m6kqv" Apr 16 18:31:37.544718 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544610 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f721ab0c-9cef-4965-905d-b47537a6ad94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.544718 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544698 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dljw\" (UniqueName: \"kubernetes.io/projected/f721ab0c-9cef-4965-905d-b47537a6ad94-kube-api-access-5dljw\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.544830 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544741 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a157f4b-852b-4fc1-867b-72319f3a23ef-metrics-tls\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.544830 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-image-registry-private-configuration\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.544830 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a157f4b-852b-4fc1-867b-72319f3a23ef-config-volume\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.544978 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f721ab0c-9cef-4965-905d-b47537a6ad94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.544978 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-bound-sa-token\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " 
pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.544978 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544882 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a157f4b-852b-4fc1-867b-72319f3a23ef-tmp-dir\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.544978 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5db233fe-b415-4303-a7db-96df89fba6f1-cert\") pod \"ingress-canary-m6kqv\" (UID: \"5db233fe-b415-4303-a7db-96df89fba6f1\") " pod="openshift-ingress-canary/ingress-canary-m6kqv" Apr 16 18:31:37.544978 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tnds\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-kube-api-access-8tnds\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.544978 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544958 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-registry-tls\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.545252 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544994 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-trusted-ca\") pod 
\"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.545252 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.545019 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5dcg\" (UniqueName: \"kubernetes.io/projected/0a157f4b-852b-4fc1-867b-72319f3a23ef-kube-api-access-x5dcg\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.545336 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.545314 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-registry-certificates\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.545374 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.544882 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-ca-trust-extracted\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.545374 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.545362 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a157f4b-852b-4fc1-867b-72319f3a23ef-config-volume\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.545451 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.545368 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/0a157f4b-852b-4fc1-867b-72319f3a23ef-tmp-dir\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.546503 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.546482 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-trusted-ca\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.548836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.548808 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-image-registry-private-configuration\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.548836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.548824 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a157f4b-852b-4fc1-867b-72319f3a23ef-metrics-tls\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.548958 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.548872 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-registry-tls\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.548958 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.548897 2580 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-installation-pull-secrets\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.556645 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.556618 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-bound-sa-token\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.556730 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.556640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5dcg\" (UniqueName: \"kubernetes.io/projected/0a157f4b-852b-4fc1-867b-72319f3a23ef-kube-api-access-x5dcg\") pod \"dns-default-m2f22\" (UID: \"0a157f4b-852b-4fc1-867b-72319f3a23ef\") " pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.556959 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.556942 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tnds\" (UniqueName: \"kubernetes.io/projected/9922e3fb-d0ae-4fbd-b96a-81793a1f521a-kube-api-access-8tnds\") pod \"image-registry-65688f687b-mwsdh\" (UID: \"9922e3fb-d0ae-4fbd-b96a-81793a1f521a\") " pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.629978 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.629945 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:37.645627 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.645575 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f721ab0c-9cef-4965-905d-b47537a6ad94-crio-socket\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.645627 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.645608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f721ab0c-9cef-4965-905d-b47537a6ad94-data-volume\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.645753 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.645641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dt7s\" (UniqueName: \"kubernetes.io/projected/5db233fe-b415-4303-a7db-96df89fba6f1-kube-api-access-2dt7s\") pod \"ingress-canary-m6kqv\" (UID: \"5db233fe-b415-4303-a7db-96df89fba6f1\") " pod="openshift-ingress-canary/ingress-canary-m6kqv" Apr 16 18:31:37.645933 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.645756 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f721ab0c-9cef-4965-905d-b47537a6ad94-crio-socket\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.645972 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.645772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/f721ab0c-9cef-4965-905d-b47537a6ad94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.646006 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.645962 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f721ab0c-9cef-4965-905d-b47537a6ad94-data-volume\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.646006 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.645986 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dljw\" (UniqueName: \"kubernetes.io/projected/f721ab0c-9cef-4965-905d-b47537a6ad94-kube-api-access-5dljw\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.646067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.646056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f721ab0c-9cef-4965-905d-b47537a6ad94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.646107 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.646094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5db233fe-b415-4303-a7db-96df89fba6f1-cert\") pod \"ingress-canary-m6kqv\" (UID: \"5db233fe-b415-4303-a7db-96df89fba6f1\") " pod="openshift-ingress-canary/ingress-canary-m6kqv" Apr 16 18:31:37.646246 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:31:37.646226 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f721ab0c-9cef-4965-905d-b47537a6ad94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.648286 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.648268 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f721ab0c-9cef-4965-905d-b47537a6ad94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.648385 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.648298 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5db233fe-b415-4303-a7db-96df89fba6f1-cert\") pod \"ingress-canary-m6kqv\" (UID: \"5db233fe-b415-4303-a7db-96df89fba6f1\") " pod="openshift-ingress-canary/ingress-canary-m6kqv" Apr 16 18:31:37.653865 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.653844 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dljw\" (UniqueName: \"kubernetes.io/projected/f721ab0c-9cef-4965-905d-b47537a6ad94-kube-api-access-5dljw\") pod \"insights-runtime-extractor-6d74h\" (UID: \"f721ab0c-9cef-4965-905d-b47537a6ad94\") " pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.653865 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.653844 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dt7s\" (UniqueName: \"kubernetes.io/projected/5db233fe-b415-4303-a7db-96df89fba6f1-kube-api-access-2dt7s\") pod \"ingress-canary-m6kqv\" (UID: \"5db233fe-b415-4303-a7db-96df89fba6f1\") " 
pod="openshift-ingress-canary/ingress-canary-m6kqv" Apr 16 18:31:37.658733 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.658712 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:37.680560 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.680529 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6d74h" Apr 16 18:31:37.698268 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.698241 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m6kqv" Apr 16 18:31:37.833084 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.833050 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m2f22"] Apr 16 18:31:37.833315 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.833299 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65688f687b-mwsdh"] Apr 16 18:31:37.854211 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.854135 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6d74h"] Apr 16 18:31:37.857661 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:37.857637 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf721ab0c_9cef_4965_905d_b47537a6ad94.slice/crio-f4c484732373d2d99751353d6ea5539bb739283f56b31e08eca0fbb1ef17864b WatchSource:0}: Error finding container f4c484732373d2d99751353d6ea5539bb739283f56b31e08eca0fbb1ef17864b: Status 404 returned error can't find the container with id f4c484732373d2d99751353d6ea5539bb739283f56b31e08eca0fbb1ef17864b Apr 16 18:31:37.896530 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.896509 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m6kqv"] Apr 
16 18:31:37.900514 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:37.900484 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5db233fe_b415_4303_a7db_96df89fba6f1.slice/crio-705a78da4730213a7b3ffce1c7e609e02e6375ffb45c176ab55f4a89b87ba3aa WatchSource:0}: Error finding container 705a78da4730213a7b3ffce1c7e609e02e6375ffb45c176ab55f4a89b87ba3aa: Status 404 returned error can't find the container with id 705a78da4730213a7b3ffce1c7e609e02e6375ffb45c176ab55f4a89b87ba3aa Apr 16 18:31:37.915420 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.915399 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-678f445d6c-ps4hz"] Apr 16 18:31:37.926371 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.926349 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:37.929441 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.929422 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:31:37.929551 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.929444 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:31:37.930195 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.930178 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:31:37.930437 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.930421 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:31:37.930670 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.930657 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 
18:31:37.931317 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.931294 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:31:37.932195 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.932175 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:31:37.933613 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.933068 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xf4fr\"" Apr 16 18:31:37.933833 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:37.933805 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-678f445d6c-ps4hz"] Apr 16 18:31:38.052708 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.052674 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-service-ca\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.052852 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.052735 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-oauth-config\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.052852 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.052756 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-oauth-serving-cert\") pod \"console-678f445d6c-ps4hz\" 
(UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.052852 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.052776 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-serving-cert\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.052984 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.052911 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfc44\" (UniqueName: \"kubernetes.io/projected/96663b4b-e1a0-4eee-bbf7-c53a50536f00-kube-api-access-lfc44\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.052984 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.052942 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-config\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.153836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.153758 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-oauth-config\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.153836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.153794 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-oauth-serving-cert\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.153836 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.153817 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-serving-cert\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.154073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.153914 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfc44\" (UniqueName: \"kubernetes.io/projected/96663b4b-e1a0-4eee-bbf7-c53a50536f00-kube-api-access-lfc44\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.154073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.153946 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-config\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.154073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.153972 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-service-ca\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.154577 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.154545 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-oauth-serving-cert\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.154727 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.154688 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-config\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.154727 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.154706 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-service-ca\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.157358 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.157333 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-oauth-config\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.157358 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.157348 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-serving-cert\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.162052 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:31:38.162030 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfc44\" (UniqueName: \"kubernetes.io/projected/96663b4b-e1a0-4eee-bbf7-c53a50536f00-kube-api-access-lfc44\") pod \"console-678f445d6c-ps4hz\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.236299 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.236270 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:31:38.375832 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.375800 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-678f445d6c-ps4hz"] Apr 16 18:31:38.380252 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:38.380138 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96663b4b_e1a0_4eee_bbf7_c53a50536f00.slice/crio-a9badc9d776cfae2557ad1e9f72d0f73145dea4ab87fe3d276d6a3a72590a7ed WatchSource:0}: Error finding container a9badc9d776cfae2557ad1e9f72d0f73145dea4ab87fe3d276d6a3a72590a7ed: Status 404 returned error can't find the container with id a9badc9d776cfae2557ad1e9f72d0f73145dea4ab87fe3d276d6a3a72590a7ed Apr 16 18:31:38.457269 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.457030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:38.457687 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:38.457271 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:31:38.457687 ip-10-0-135-146 kubenswrapper[2580]: 
E0416 18:31:38.457345 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret podName:d2366368-d34f-4493-b764-4aa4105b1922 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:42.45732588 +0000 UTC m=+53.546589465 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret") pod "global-pull-secret-syncer-989rq" (UID: "d2366368-d34f-4493-b764-4aa4105b1922") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:31:38.535570 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.534612 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-989rq" Apr 16 18:31:38.535570 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.534831 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:38.535570 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.535177 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf"
Apr 16 18:31:38.538572 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.537827 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:31:38.538572 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.537960 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2j6pd\""
Apr 16 18:31:38.538572 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.538042 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:31:38.538572 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.538188 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ds8t7\""
Apr 16 18:31:38.538572 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.538468 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:31:38.540510 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.540310 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:31:38.704847 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.704809 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m6kqv" event={"ID":"5db233fe-b415-4303-a7db-96df89fba6f1","Type":"ContainerStarted","Data":"705a78da4730213a7b3ffce1c7e609e02e6375ffb45c176ab55f4a89b87ba3aa"}
Apr 16 18:31:38.706680 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.706643 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6d74h" event={"ID":"f721ab0c-9cef-4965-905d-b47537a6ad94","Type":"ContainerStarted","Data":"34e40f4d4adbbfc0194b7ddb0a3054dee38e44af46e1a45ff9da3b1d9ba99686"}
Apr 16 18:31:38.706811 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.706683 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6d74h" event={"ID":"f721ab0c-9cef-4965-905d-b47537a6ad94","Type":"ContainerStarted","Data":"f4c484732373d2d99751353d6ea5539bb739283f56b31e08eca0fbb1ef17864b"}
Apr 16 18:31:38.708307 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.708236 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m2f22" event={"ID":"0a157f4b-852b-4fc1-867b-72319f3a23ef","Type":"ContainerStarted","Data":"973ec03ff31d8b71c5ed8eb5a451d845fde3024341bbfdaa7392f0d9e720639e"}
Apr 16 18:31:38.709899 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.709861 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678f445d6c-ps4hz" event={"ID":"96663b4b-e1a0-4eee-bbf7-c53a50536f00","Type":"ContainerStarted","Data":"a9badc9d776cfae2557ad1e9f72d0f73145dea4ab87fe3d276d6a3a72590a7ed"}
Apr 16 18:31:38.711425 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.711396 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65688f687b-mwsdh" event={"ID":"9922e3fb-d0ae-4fbd-b96a-81793a1f521a","Type":"ContainerStarted","Data":"06e4f02f79f83eaa0f144888940a21bc2754552c04387db1fa80b7c669a22151"}
Apr 16 18:31:38.711525 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.711426 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65688f687b-mwsdh" event={"ID":"9922e3fb-d0ae-4fbd-b96a-81793a1f521a","Type":"ContainerStarted","Data":"d9e849dde1e739c32f21e1f3734c063663570f2780a3559fc3175a6e01a07c8c"}
Apr 16 18:31:38.711657 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.711629 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-65688f687b-mwsdh"
Apr 16 18:31:38.732715 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:38.732664 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-65688f687b-mwsdh" podStartSLOduration=6.732650562 podStartE2EDuration="6.732650562s" podCreationTimestamp="2026-04-16 18:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:38.731982788 +0000 UTC m=+49.821246402" watchObservedRunningTime="2026-04-16 18:31:38.732650562 +0000 UTC m=+49.821914157"
Apr 16 18:31:40.718374 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:40.718343 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m6kqv" event={"ID":"5db233fe-b415-4303-a7db-96df89fba6f1","Type":"ContainerStarted","Data":"404c180162eca2f91726a317c387aa5074f4c9e4df8645cabb732371ce6a407d"}
Apr 16 18:31:40.720336 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:40.720314 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6d74h" event={"ID":"f721ab0c-9cef-4965-905d-b47537a6ad94","Type":"ContainerStarted","Data":"3506288d221cb66636de76cd2d49009d1a468f6dd6f174f1dc1d6746383ab1b4"}
Apr 16 18:31:40.721893 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:40.721871 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m2f22" event={"ID":"0a157f4b-852b-4fc1-867b-72319f3a23ef","Type":"ContainerStarted","Data":"df12afb1805ade1fe4059d014044243d7ae042227838930d576a341f7328707d"}
Apr 16 18:31:40.735402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:40.735352 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m6kqv" podStartSLOduration=1.246021142 podStartE2EDuration="3.735338677s" podCreationTimestamp="2026-04-16 18:31:37 +0000 UTC" firstStartedPulling="2026-04-16 18:31:37.902545535 +0000 UTC m=+48.991809106" lastFinishedPulling="2026-04-16 18:31:40.391863061 +0000 UTC m=+51.481126641" observedRunningTime="2026-04-16 18:31:40.734718864 +0000 UTC m=+51.823982462" watchObservedRunningTime="2026-04-16 18:31:40.735338677 +0000 UTC m=+51.824602271"
Apr 16 18:31:41.703545 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.703511 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"]
Apr 16 18:31:41.744034 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.743977 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m2f22" event={"ID":"0a157f4b-852b-4fc1-867b-72319f3a23ef","Type":"ContainerStarted","Data":"8b57dffbe2eb839e1e5d5952cd4af1699a0c0713f7424b579bdd8e204df0cb64"}
Apr 16 18:31:41.744034 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.744022 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"]
Apr 16 18:31:41.744556 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.744495 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-m2f22"
Apr 16 18:31:41.744556 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.744543 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"
Apr 16 18:31:41.747089 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.747068 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-kgln4\""
Apr 16 18:31:41.747237 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.747208 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 18:31:41.782721 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.782669 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m2f22" podStartSLOduration=2.2413971 podStartE2EDuration="4.782653766s" podCreationTimestamp="2026-04-16 18:31:37 +0000 UTC" firstStartedPulling="2026-04-16 18:31:37.845817517 +0000 UTC m=+48.935081105" lastFinishedPulling="2026-04-16 18:31:40.387074186 +0000 UTC m=+51.476337771" observedRunningTime="2026-04-16 18:31:41.782232868 +0000 UTC m=+52.871496480" watchObservedRunningTime="2026-04-16 18:31:41.782653766 +0000 UTC m=+52.871917355"
Apr 16 18:31:41.887019 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.886978 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e1ab7fe5-93ad-411e-8815-6ff339e612fc-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wh7nr\" (UID: \"e1ab7fe5-93ad-411e-8815-6ff339e612fc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"
Apr 16 18:31:41.987845 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.987814 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e1ab7fe5-93ad-411e-8815-6ff339e612fc-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wh7nr\" (UID: \"e1ab7fe5-93ad-411e-8815-6ff339e612fc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"
Apr 16 18:31:41.991714 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:41.991688 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e1ab7fe5-93ad-411e-8815-6ff339e612fc-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wh7nr\" (UID: \"e1ab7fe5-93ad-411e-8815-6ff339e612fc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"
Apr 16 18:31:42.056359 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:42.056326 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"
Apr 16 18:31:42.273603 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:42.273576 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"]
Apr 16 18:31:42.278448 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:42.278415 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1ab7fe5_93ad_411e_8815_6ff339e612fc.slice/crio-995d679049416937c01d4268c36757586beb03947009a6c24871143208798fbf WatchSource:0}: Error finding container 995d679049416937c01d4268c36757586beb03947009a6c24871143208798fbf: Status 404 returned error can't find the container with id 995d679049416937c01d4268c36757586beb03947009a6c24871143208798fbf
Apr 16 18:31:42.491846 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:42.491761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq"
Apr 16 18:31:42.495385 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:42.495354 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d2366368-d34f-4493-b764-4aa4105b1922-original-pull-secret\") pod \"global-pull-secret-syncer-989rq\" (UID: \"d2366368-d34f-4493-b764-4aa4105b1922\") " pod="kube-system/global-pull-secret-syncer-989rq"
Apr 16 18:31:42.735208 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:42.735166 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678f445d6c-ps4hz" event={"ID":"96663b4b-e1a0-4eee-bbf7-c53a50536f00","Type":"ContainerStarted","Data":"de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e"}
Apr 16 18:31:42.736329 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:42.736298 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr" event={"ID":"e1ab7fe5-93ad-411e-8815-6ff339e612fc","Type":"ContainerStarted","Data":"995d679049416937c01d4268c36757586beb03947009a6c24871143208798fbf"}
Apr 16 18:31:42.749908 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:42.749852 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-989rq"
Apr 16 18:31:42.754859 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:42.754820 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-678f445d6c-ps4hz" podStartSLOduration=2.002621146 podStartE2EDuration="5.754808537s" podCreationTimestamp="2026-04-16 18:31:37 +0000 UTC" firstStartedPulling="2026-04-16 18:31:38.382759174 +0000 UTC m=+49.472022751" lastFinishedPulling="2026-04-16 18:31:42.134946558 +0000 UTC m=+53.224210142" observedRunningTime="2026-04-16 18:31:42.754539801 +0000 UTC m=+53.843803394" watchObservedRunningTime="2026-04-16 18:31:42.754808537 +0000 UTC m=+53.844072130"
Apr 16 18:31:43.240661 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.240425 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-989rq"]
Apr 16 18:31:43.243950 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:43.243913 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2366368_d34f_4493_b764_4aa4105b1922.slice/crio-5dea6b44889c93ba71c611bac41f2ff76fbc8c0509aad9e1176db1f762f25bb3 WatchSource:0}: Error finding container 5dea6b44889c93ba71c611bac41f2ff76fbc8c0509aad9e1176db1f762f25bb3: Status 404 returned error can't find the container with id 5dea6b44889c93ba71c611bac41f2ff76fbc8c0509aad9e1176db1f762f25bb3
Apr 16 18:31:43.260725 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.260702 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c8f9bdf47-kv598"]
Apr 16 18:31:43.285210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.285191 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c8f9bdf47-kv598"]
Apr 16 18:31:43.285312 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.285300 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.292552 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.292531 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:31:43.296044 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.296024 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-oauth-serving-cert\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.296126 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.296059 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-oauth-config\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.296126 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.296076 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-serving-cert\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.296126 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.296096 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9hsg\" (UniqueName: \"kubernetes.io/projected/5b5a2164-270c-45a5-a588-d6ae77297d86-kube-api-access-m9hsg\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.296257 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.296180 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-console-config\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.296257 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.296207 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-trusted-ca-bundle\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.296257 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.296241 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-service-ca\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.397436 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.397366 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-oauth-serving-cert\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.397436 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.397413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-oauth-config\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.397436 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.397437 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-serving-cert\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.397644 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.397550 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9hsg\" (UniqueName: \"kubernetes.io/projected/5b5a2164-270c-45a5-a588-d6ae77297d86-kube-api-access-m9hsg\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.397644 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.397596 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-console-config\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.397644 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.397625 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-trusted-ca-bundle\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.397787 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.397680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-service-ca\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.398162 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.398115 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-oauth-serving-cert\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.398358 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.398333 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-service-ca\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.398457 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.398341 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-console-config\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.398654 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.398635 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-trusted-ca-bundle\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.400347 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.400325 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-oauth-config\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.400466 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.400448 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-serving-cert\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.405917 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.405895 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9hsg\" (UniqueName: \"kubernetes.io/projected/5b5a2164-270c-45a5-a588-d6ae77297d86-kube-api-access-m9hsg\") pod \"console-5c8f9bdf47-kv598\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.593867 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.593832 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:43.739718 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.739684 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-989rq" event={"ID":"d2366368-d34f-4493-b764-4aa4105b1922","Type":"ContainerStarted","Data":"5dea6b44889c93ba71c611bac41f2ff76fbc8c0509aad9e1176db1f762f25bb3"}
Apr 16 18:31:43.920440 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:43.920414 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c8f9bdf47-kv598"]
Apr 16 18:31:43.923298 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:43.923266 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5a2164_270c_45a5_a588_d6ae77297d86.slice/crio-f3956496fdfadb38aaddaf161347f14df418a1404b604cd38a82afb16bed454b WatchSource:0}: Error finding container f3956496fdfadb38aaddaf161347f14df418a1404b604cd38a82afb16bed454b: Status 404 returned error can't find the container with id f3956496fdfadb38aaddaf161347f14df418a1404b604cd38a82afb16bed454b
Apr 16 18:31:44.744581 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.744272 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr" event={"ID":"e1ab7fe5-93ad-411e-8815-6ff339e612fc","Type":"ContainerStarted","Data":"5be100646309f5f12830febf106089c0c911caab2bc4511c6a0ac87f2d75892c"}
Apr 16 18:31:44.744581 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.744551 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"
Apr 16 18:31:44.746860 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.746813 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6d74h" event={"ID":"f721ab0c-9cef-4965-905d-b47537a6ad94","Type":"ContainerStarted","Data":"26ebc1be3e477e46ea36777e4c54b28e7cea8ae19e937168fd88ad20dc4cb9ad"}
Apr 16 18:31:44.748687 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.748665 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c8f9bdf47-kv598" event={"ID":"5b5a2164-270c-45a5-a588-d6ae77297d86","Type":"ContainerStarted","Data":"7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b"}
Apr 16 18:31:44.748827 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.748693 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c8f9bdf47-kv598" event={"ID":"5b5a2164-270c-45a5-a588-d6ae77297d86","Type":"ContainerStarted","Data":"f3956496fdfadb38aaddaf161347f14df418a1404b604cd38a82afb16bed454b"}
Apr 16 18:31:44.750543 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.750521 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr"
Apr 16 18:31:44.763502 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.763463 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh7nr" podStartSLOduration=1.825060853 podStartE2EDuration="3.763451693s" podCreationTimestamp="2026-04-16 18:31:41 +0000 UTC" firstStartedPulling="2026-04-16 18:31:42.2810045 +0000 UTC m=+53.370268077" lastFinishedPulling="2026-04-16 18:31:44.219395333 +0000 UTC m=+55.308658917" observedRunningTime="2026-04-16 18:31:44.761362246 +0000 UTC m=+55.850625840" watchObservedRunningTime="2026-04-16 18:31:44.763451693 +0000 UTC m=+55.852715285"
Apr 16 18:31:44.801686 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.801646 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6d74h" podStartSLOduration=1.948856744 podStartE2EDuration="7.801629934s" podCreationTimestamp="2026-04-16 18:31:37 +0000 UTC" firstStartedPulling="2026-04-16 18:31:37.956414146 +0000 UTC m=+49.045677717" lastFinishedPulling="2026-04-16 18:31:43.809187333 +0000 UTC m=+54.898450907" observedRunningTime="2026-04-16 18:31:44.801610783 +0000 UTC m=+55.890874380" watchObservedRunningTime="2026-04-16 18:31:44.801629934 +0000 UTC m=+55.890893519"
Apr 16 18:31:44.819581 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:44.819536 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c8f9bdf47-kv598" podStartSLOduration=1.819523609 podStartE2EDuration="1.819523609s" podCreationTimestamp="2026-04-16 18:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:44.818596726 +0000 UTC m=+55.907860320" watchObservedRunningTime="2026-04-16 18:31:44.819523609 +0000 UTC m=+55.908787201"
Apr 16 18:31:45.755884 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.755852 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-25s9k"]
Apr 16 18:31:45.779431 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.779399 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-25s9k"]
Apr 16 18:31:45.779582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.779477 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.783236 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.783205 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:31:45.783381 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.783255 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7cndm\""
Apr 16 18:31:45.783381 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.783269 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:31:45.783381 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.783341 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:31:45.783381 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.783256 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 18:31:45.783597 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.783417 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 18:31:45.820243 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.820212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c43021-6375-457e-857d-d95cce06e340-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.820412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.820250 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05c43021-6375-457e-857d-d95cce06e340-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.820412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.820287 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05c43021-6375-457e-857d-d95cce06e340-metrics-client-ca\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.820412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.820375 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqn6r\" (UniqueName: \"kubernetes.io/projected/05c43021-6375-457e-857d-d95cce06e340-kube-api-access-tqn6r\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.921751 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.921717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c43021-6375-457e-857d-d95cce06e340-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.921931 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.921757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05c43021-6375-457e-857d-d95cce06e340-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.921931 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.921780 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05c43021-6375-457e-857d-d95cce06e340-metrics-client-ca\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.922033 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.921991 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqn6r\" (UniqueName: \"kubernetes.io/projected/05c43021-6375-457e-857d-d95cce06e340-kube-api-access-tqn6r\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.922641 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.922613 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05c43021-6375-457e-857d-d95cce06e340-metrics-client-ca\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.925183 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.925138 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c43021-6375-457e-857d-d95cce06e340-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.925296 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.925179 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05c43021-6375-457e-857d-d95cce06e340-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:45.931508 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:45.931478 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqn6r\" (UniqueName: \"kubernetes.io/projected/05c43021-6375-457e-857d-d95cce06e340-kube-api-access-tqn6r\") pod \"prometheus-operator-78f957474d-25s9k\" (UID: \"05c43021-6375-457e-857d-d95cce06e340\") " pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:46.090573 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:46.090498 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k"
Apr 16 18:31:47.490182 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:47.490133 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-25s9k"]
Apr 16 18:31:47.493488 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:47.493456 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05c43021_6375_457e_857d_d95cce06e340.slice/crio-6a8a2aed21f3acf1e52bb266b60b2a42f426cc8b1ee6bbf6745fdcad69f5347a WatchSource:0}: Error finding container 6a8a2aed21f3acf1e52bb266b60b2a42f426cc8b1ee6bbf6745fdcad69f5347a: Status 404 returned error can't find the container with id 6a8a2aed21f3acf1e52bb266b60b2a42f426cc8b1ee6bbf6745fdcad69f5347a
Apr 16 18:31:47.662975 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:47.662889 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzqh"
Apr 16 18:31:47.757856 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:47.757818 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k" event={"ID":"05c43021-6375-457e-857d-d95cce06e340","Type":"ContainerStarted","Data":"6a8a2aed21f3acf1e52bb266b60b2a42f426cc8b1ee6bbf6745fdcad69f5347a"}
Apr 16 18:31:47.759091 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:47.759068 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-989rq" event={"ID":"d2366368-d34f-4493-b764-4aa4105b1922","Type":"ContainerStarted","Data":"3dd28bdc32f26c2e772ecb2ca97d1f26b094aa5dca6aacb090305db3ccd2b61c"}
Apr 16 18:31:47.779699 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:47.779645 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-989rq" podStartSLOduration=9.601030827 podStartE2EDuration="13.779631846s" podCreationTimestamp="2026-04-16 18:31:34 +0000 UTC" firstStartedPulling="2026-04-16 18:31:43.245482952 +0000 UTC m=+54.334746523" lastFinishedPulling="2026-04-16 18:31:47.424083957 +0000 UTC m=+58.513347542" observedRunningTime="2026-04-16 18:31:47.77912637 +0000 UTC m=+58.868389965" watchObservedRunningTime="2026-04-16 18:31:47.779631846 +0000 UTC m=+58.868895438"
Apr 16 18:31:48.237038 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:48.236999 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-678f445d6c-ps4hz"
Apr 16 18:31:48.237244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:48.237091 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-678f445d6c-ps4hz"
Apr 16 18:31:48.238465 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:48.238440 2580 patch_prober.go:28] interesting pod/console-678f445d6c-ps4hz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.10:8443/health\": dial tcp 10.133.0.10:8443: connect: connection refused" start-of-body=
Apr 16 18:31:48.238566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:48.238501 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-678f445d6c-ps4hz" podUID="96663b4b-e1a0-4eee-bbf7-c53a50536f00" containerName="console" probeResult="failure" output="Get \"https://10.133.0.10:8443/health\": dial tcp 10.133.0.10:8443: connect: connection refused"
Apr 16 18:31:49.766658 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:49.766624 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k" event={"ID":"05c43021-6375-457e-857d-d95cce06e340","Type":"ContainerStarted","Data":"9f0c4e8816eca3a205fe13a888770aea6720ad96594fb3556167992ae1850d95"}
Apr 16 18:31:49.766658 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:49.766657 2580
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k" event={"ID":"05c43021-6375-457e-857d-d95cce06e340","Type":"ContainerStarted","Data":"3752afee74b491ae3262a12376e582d18f5fdd1f049bee7a9a18b8fa8ddd47eb"} Apr 16 18:31:49.784897 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:49.784852 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-25s9k" podStartSLOduration=3.434313645 podStartE2EDuration="4.784837313s" podCreationTimestamp="2026-04-16 18:31:45 +0000 UTC" firstStartedPulling="2026-04-16 18:31:47.49610752 +0000 UTC m=+58.585371094" lastFinishedPulling="2026-04-16 18:31:48.846631173 +0000 UTC m=+59.935894762" observedRunningTime="2026-04-16 18:31:49.784088677 +0000 UTC m=+60.873352270" watchObservedRunningTime="2026-04-16 18:31:49.784837313 +0000 UTC m=+60.874100905" Apr 16 18:31:51.738293 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:51.738263 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m2f22" Apr 16 18:31:52.204179 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.204080 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb"] Apr 16 18:31:52.208466 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.208451 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.211675 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.211646 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:31:52.211803 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.211677 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-v9bx5\"" Apr 16 18:31:52.212132 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.212116 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:31:52.220616 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.220594 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb"] Apr 16 18:31:52.233948 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.233926 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"] Apr 16 18:31:52.238079 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.238061 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.240750 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.240731 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:31:52.240978 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.240731 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-2xpv7\"" Apr 16 18:31:52.241318 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.241302 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:31:52.242901 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.242886 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:31:52.248848 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.248826 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"] Apr 16 18:31:52.252774 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.252755 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-46rt7"] Apr 16 18:31:52.255955 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.255936 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.258627 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.258601 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:31:52.258731 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.258608 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:31:52.259423 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.259396 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:31:52.259509 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.259492 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fwjvq\"" Apr 16 18:31:52.272916 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.272888 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-root\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273021 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.272926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-tls\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273021 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.272952 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/929db997-d29c-44d5-9142-d93108884045-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.273021 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.272997 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.273187 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273024 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.273187 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273077 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89rhn\" (UniqueName: \"kubernetes.io/projected/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-api-access-89rhn\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.273187 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273096 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-textfile\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273187 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273112 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273195 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.273402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-accelerators-collector-config\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273263 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51090ddf-429a-4d9e-a370-e6eb5bd84777-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: 
\"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.273402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273301 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqkkf\" (UniqueName: \"kubernetes.io/projected/929db997-d29c-44d5-9142-d93108884045-kube-api-access-xqkkf\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.273402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273350 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/51090ddf-429a-4d9e-a370-e6eb5bd84777-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.273402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-metrics-client-ca\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273658 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273408 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.273658 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273443 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9p86\" (UniqueName: \"kubernetes.io/projected/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-kube-api-access-b9p86\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273658 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-sys\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273658 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273496 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-wtmp\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.273658 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.273526 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.374201 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-root\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.374392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374211 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-tls\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.374392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374229 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/929db997-d29c-44d5-9142-d93108884045-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.374392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374231 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-root\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.374392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374253 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.374392 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:31:52.374301 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.374392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374363 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89rhn\" (UniqueName: \"kubernetes.io/projected/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-api-access-89rhn\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.374392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374384 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-textfile\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.374737 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374409 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.374737 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:52.374547 2580 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:31:52.374737 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:52.374611 2580 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-tls podName:4c490b1a-eb3a-4699-8602-4ab84dc9d32b nodeName:}" failed. No retries permitted until 2026-04-16 18:31:52.874591686 +0000 UTC m=+63.963855268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-tls") pod "node-exporter-46rt7" (UID: "4c490b1a-eb3a-4699-8602-4ab84dc9d32b") : secret "node-exporter-tls" not found Apr 16 18:31:52.374906 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:52.374741 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:31:52.374906 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374781 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.374906 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:52.374803 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-tls podName:51090ddf-429a-4d9e-a370-e6eb5bd84777 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:52.874786016 +0000 UTC m=+63.964049601 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-6dwrf" (UID: "51090ddf-429a-4d9e-a370-e6eb5bd84777") : secret "kube-state-metrics-tls" not found Apr 16 18:31:52.374906 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-accelerators-collector-config\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.374906 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:52.374874 2580 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 18:31:52.374906 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51090ddf-429a-4d9e-a370-e6eb5bd84777-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:52.374928 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-tls podName:929db997-d29c-44d5-9142-d93108884045 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:52.874910661 +0000 UTC m=+63.964174252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-5zjsb" (UID: "929db997-d29c-44d5-9142-d93108884045") : secret "openshift-state-metrics-tls" not found Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.374967 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqkkf\" (UniqueName: \"kubernetes.io/projected/929db997-d29c-44d5-9142-d93108884045-kube-api-access-xqkkf\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375008 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/51090ddf-429a-4d9e-a370-e6eb5bd84777-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375011 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/929db997-d29c-44d5-9142-d93108884045-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375046 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375078 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-metrics-client-ca\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375117 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375129 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-textfile\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7" Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375170 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9p86\" (UniqueName: \"kubernetes.io/projected/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-kube-api-access-b9p86\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " 
pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.375240 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375217 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-sys\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.375745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375249 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-wtmp\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.375745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375280 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"
Apr 16 18:31:52.375745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375338 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-sys\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.375745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375342 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/51090ddf-429a-4d9e-a370-e6eb5bd84777-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"
Apr 16 18:31:52.375745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375472 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-wtmp\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.375745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375567 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51090ddf-429a-4d9e-a370-e6eb5bd84777-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"
Apr 16 18:31:52.376034 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375953 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-accelerators-collector-config\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.376034 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.375989 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-metrics-client-ca\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.377129 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.377096 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.377714 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.377696 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb"
Apr 16 18:31:52.377768 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.377749 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"
Apr 16 18:31:52.396687 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.396650 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89rhn\" (UniqueName: \"kubernetes.io/projected/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-api-access-89rhn\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"
Apr 16 18:31:52.397749 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.397724 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqkkf\" (UniqueName: \"kubernetes.io/projected/929db997-d29c-44d5-9142-d93108884045-kube-api-access-xqkkf\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb"
Apr 16 18:31:52.398885 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.398415 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9p86\" (UniqueName: \"kubernetes.io/projected/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-kube-api-access-b9p86\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.880339 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.880281 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb"
Apr 16 18:31:52.880813 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.880405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-tls\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.880813 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.880438 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"
Apr 16 18:31:52.882760 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.882725 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c490b1a-eb3a-4699-8602-4ab84dc9d32b-node-exporter-tls\") pod \"node-exporter-46rt7\" (UID: \"4c490b1a-eb3a-4699-8602-4ab84dc9d32b\") " pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:52.882865 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.882843 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/929db997-d29c-44d5-9142-d93108884045-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-5zjsb\" (UID: \"929db997-d29c-44d5-9142-d93108884045\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb"
Apr 16 18:31:52.883079 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:52.883061 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/51090ddf-429a-4d9e-a370-e6eb5bd84777-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6dwrf\" (UID: \"51090ddf-429a-4d9e-a370-e6eb5bd84777\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"
Apr 16 18:31:53.116761 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.116725 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb"
Apr 16 18:31:53.147195 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.147090 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"
Apr 16 18:31:53.163995 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.163960 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-46rt7"
Apr 16 18:31:53.173980 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:53.173874 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c490b1a_eb3a_4699_8602_4ab84dc9d32b.slice/crio-9e652c539d988873c5c577847e9a8cfbcd651f194b478488d2df9663e7f663d1 WatchSource:0}: Error finding container 9e652c539d988873c5c577847e9a8cfbcd651f194b478488d2df9663e7f663d1: Status 404 returned error can't find the container with id 9e652c539d988873c5c577847e9a8cfbcd651f194b478488d2df9663e7f663d1
Apr 16 18:31:53.259739 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.259704 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb"]
Apr 16 18:31:53.263757 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:53.263685 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929db997_d29c_44d5_9142_d93108884045.slice/crio-104aa8ac05322fe34eb2d8e767b51be3a01cf7b2ee9fed53ebf3542b98b6335f WatchSource:0}: Error finding container 104aa8ac05322fe34eb2d8e767b51be3a01cf7b2ee9fed53ebf3542b98b6335f: Status 404 returned error can't find the container with id 104aa8ac05322fe34eb2d8e767b51be3a01cf7b2ee9fed53ebf3542b98b6335f
Apr 16 18:31:53.291044 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.291021 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6dwrf"]
Apr 16 18:31:53.295266 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:53.295240 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51090ddf_429a_4d9e_a370_e6eb5bd84777.slice/crio-598cfcfa926fece5fb0a3f02208b3c345ac812f026fce089eac2c06b3a475fcc WatchSource:0}: Error finding container 598cfcfa926fece5fb0a3f02208b3c345ac812f026fce089eac2c06b3a475fcc: Status 404 returned error can't find the container with id 598cfcfa926fece5fb0a3f02208b3c345ac812f026fce089eac2c06b3a475fcc
Apr 16 18:31:53.305856 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.305834 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:31:53.310758 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.310740 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.314516 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.314495 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 18:31:53.314633 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.314613 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 18:31:53.314748 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.314728 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 18:31:53.314916 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.314900 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7p7d5\""
Apr 16 18:31:53.315085 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.315066 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 18:31:53.315384 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.315357 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 18:31:53.315384 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.315377 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 18:31:53.315565 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.315437 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 18:31:53.315629 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.315594 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 18:31:53.315693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.315658 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 18:31:53.335548 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.335526 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:31:53.383745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.383714 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.383898 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.383759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.383898 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.383830 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.383898 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.383871 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384062 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.383919 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-config-out\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384062 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.383949 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384062 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.383976 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cfz\" (UniqueName: \"kubernetes.io/projected/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-kube-api-access-m7cfz\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384062 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.384013 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-web-config\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384278 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.384115 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384278 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.384183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-config-volume\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384278 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.384213 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384278 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.384247 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.384394 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.384306 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.485569 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485534 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.485712 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485576 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.485712 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485611 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-config-out\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.485712 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485637 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.485863 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:31:53.485731 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-trusted-ca-bundle podName:aa3404b2-7cf8-4cc7-b029-8cf2ef45e206 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:53.985706288 +0000 UTC m=+65.074969861 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "aa3404b2-7cf8-4cc7-b029-8cf2ef45e206") : configmap references non-existent config key: ca-bundle.crt
Apr 16 18:31:53.485863 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cfz\" (UniqueName: \"kubernetes.io/projected/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-kube-api-access-m7cfz\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.485863 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485797 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-web-config\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.486067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485864 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.486067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485904 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-config-volume\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.486067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485928 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.486067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485963 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.486067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.485985 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.486067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.486011 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.486067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.486043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.486067 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.486070 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.487298 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.487207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.488741 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.488696 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-config-volume\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.488853 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.488747 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.488911 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.488876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.489010 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.488986 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.489220 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.489200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-config-out\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.489406 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.489390 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.489466 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.489457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-web-config\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.489681 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.489660 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.490657 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.490636 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.510574 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.510542 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cfz\" (UniqueName: \"kubernetes.io/projected/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-kube-api-access-m7cfz\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.594768 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.594727 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:53.594768 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.594779 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c8f9bdf47-kv598"
Apr 16 18:31:53.596228 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.596202 2580 patch_prober.go:28] interesting pod/console-5c8f9bdf47-kv598 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.12:8443/health\": dial tcp 10.133.0.12:8443: connect: connection refused" start-of-body=
Apr 16 18:31:53.596374 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.596248 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-5c8f9bdf47-kv598" podUID="5b5a2164-270c-45a5-a588-d6ae77297d86" containerName="console" probeResult="failure" output="Get \"https://10.133.0.12:8443/health\": dial tcp 10.133.0.12:8443: connect: connection refused"
Apr 16 18:31:53.779709 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.779612 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-46rt7" event={"ID":"4c490b1a-eb3a-4699-8602-4ab84dc9d32b","Type":"ContainerStarted","Data":"9e652c539d988873c5c577847e9a8cfbcd651f194b478488d2df9663e7f663d1"}
Apr 16 18:31:53.782132 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.782095 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" event={"ID":"929db997-d29c-44d5-9142-d93108884045","Type":"ContainerStarted","Data":"7fa2f7973d726d5d1c6c48fbbfae3181e079e59ebebfccd867ea53130590f868"}
Apr 16 18:31:53.782295 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.782163 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" event={"ID":"929db997-d29c-44d5-9142-d93108884045","Type":"ContainerStarted","Data":"a544a21d12d7749637bf2ffc080c2285513f3d0070884f3da00096920b02194c"}
Apr 16 18:31:53.782295 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.782192 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" event={"ID":"929db997-d29c-44d5-9142-d93108884045","Type":"ContainerStarted","Data":"104aa8ac05322fe34eb2d8e767b51be3a01cf7b2ee9fed53ebf3542b98b6335f"}
Apr 16 18:31:53.784558 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.784534 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" event={"ID":"51090ddf-429a-4d9e-a370-e6eb5bd84777","Type":"ContainerStarted","Data":"598cfcfa926fece5fb0a3f02208b3c345ac812f026fce089eac2c06b3a475fcc"}
Apr 16 18:31:53.990244 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.990207 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:53.991028 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:53.991001 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3404b2-7cf8-4cc7-b029-8cf2ef45e206-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:54.221006 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.220909 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:31:54.296898 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.296865 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"]
Apr 16 18:31:54.301776 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.301733 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"
Apr 16 18:31:54.306489 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.306104 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ctuch0o2j8dd6\""
Apr 16 18:31:54.308955 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.308714 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 18:31:54.309073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.309010 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 18:31:54.316668 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.309320 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 18:31:54.316668 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.310224 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 18:31:54.316668 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.310608 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-n7qz8\""
Apr 16 18:31:54.316668 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.310867 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 18:31:54.323427 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.321702 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"]
Apr 16 18:31:54.396470 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.396426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"
Apr 16 18:31:54.396639 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.396486 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"
Apr 16 18:31:54.396639 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.396557 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-tls\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"
Apr 16 18:31:54.396639 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.396587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"
Apr 16 18:31:54.396799 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.396637 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"
Apr 16 18:31:54.396799 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.396676 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca31ecdf-a575-4275-9d30-6215d448e698-metrics-client-ca\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"
Apr 16 18:31:54.396799 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.396759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-grpc-tls\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"
Apr 16 18:31:54.396941 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.396829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czbn2\" (UniqueName: \"kubernetes.io/projected/ca31ecdf-a575-4275-9d30-6215d448e698-kube-api-access-czbn2\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID:
\"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.398687 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.398513 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:31:54.497632 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.497590 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-grpc-tls\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.497831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.497682 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czbn2\" (UniqueName: \"kubernetes.io/projected/ca31ecdf-a575-4275-9d30-6215d448e698-kube-api-access-czbn2\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.497831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.497715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.497831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.497740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: 
\"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.497831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.497787 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-tls\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.497831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.497812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.498084 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.497847 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.498084 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.497871 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca31ecdf-a575-4275-9d30-6215d448e698-metrics-client-ca\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.498706 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.498656 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca31ecdf-a575-4275-9d30-6215d448e698-metrics-client-ca\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.501107 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.501058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-tls\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.501648 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.501603 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.501737 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.501699 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.501973 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.501952 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-grpc-tls\") pod 
\"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.502342 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.502320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.502730 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.502683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca31ecdf-a575-4275-9d30-6215d448e698-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.507202 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.507180 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czbn2\" (UniqueName: \"kubernetes.io/projected/ca31ecdf-a575-4275-9d30-6215d448e698-kube-api-access-czbn2\") pod \"thanos-querier-d54fc9d9-x7l2k\" (UID: \"ca31ecdf-a575-4275-9d30-6215d448e698\") " pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.603223 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:54.603189 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa3404b2_7cf8_4cc7_b029_8cf2ef45e206.slice/crio-29d28df992a01659be90fc8a593a6991d0037ec82a01add2ef558a449582b01c WatchSource:0}: Error finding container 29d28df992a01659be90fc8a593a6991d0037ec82a01add2ef558a449582b01c: Status 404 returned error can't find the container with id 
29d28df992a01659be90fc8a593a6991d0037ec82a01add2ef558a449582b01c Apr 16 18:31:54.638202 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.638125 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:31:54.788910 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.788812 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206","Type":"ContainerStarted","Data":"29d28df992a01659be90fc8a593a6991d0037ec82a01add2ef558a449582b01c"} Apr 16 18:31:54.790445 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.790416 2580 generic.go:358] "Generic (PLEG): container finished" podID="4c490b1a-eb3a-4699-8602-4ab84dc9d32b" containerID="b013e38a6ee62abb6bf5c2f05c3b5c3af8e56fb16e878dd99c1a734626bdddc8" exitCode=0 Apr 16 18:31:54.790590 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:54.790485 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-46rt7" event={"ID":"4c490b1a-eb3a-4699-8602-4ab84dc9d32b","Type":"ContainerDied","Data":"b013e38a6ee62abb6bf5c2f05c3b5c3af8e56fb16e878dd99c1a734626bdddc8"} Apr 16 18:31:55.121006 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.120956 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-d54fc9d9-x7l2k"] Apr 16 18:31:55.205788 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.205404 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:55.209341 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.209307 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:31:55.218246 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.218217 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/642f0536-3a1b-4d5c-bb3d-e7128392b218-metrics-certs\") pod \"network-metrics-daemon-sjjgw\" (UID: \"642f0536-3a1b-4d5c-bb3d-e7128392b218\") " pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:55.306843 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.306761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:55.310913 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.310886 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:31:55.321463 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.321440 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:31:55.329958 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.329928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvr9m\" (UniqueName: \"kubernetes.io/projected/ba95d7cd-292c-41ec-8417-d3768d65716d-kube-api-access-cvr9m\") pod \"network-check-target-2bwhf\" (UID: \"ba95d7cd-292c-41ec-8417-d3768d65716d\") " pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:55.384824 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.384793 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2j6pd\"" Apr 16 18:31:55.392017 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.391993 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ds8t7\"" Apr 16 18:31:55.393041 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.393019 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjjgw" Apr 16 18:31:55.400792 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.400766 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:31:55.549494 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.549440 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sjjgw"] Apr 16 18:31:55.580983 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.580899 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2bwhf"] Apr 16 18:31:55.584269 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:55.584236 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba95d7cd_292c_41ec_8417_d3768d65716d.slice/crio-c7da3eff987fc5e43a260f90ca3db267d919bb4461b9fbb26275787633f08331 WatchSource:0}: Error finding container c7da3eff987fc5e43a260f90ca3db267d919bb4461b9fbb26275787633f08331: Status 404 returned error can't find the container with id c7da3eff987fc5e43a260f90ca3db267d919bb4461b9fbb26275787633f08331 Apr 16 18:31:55.796462 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.796409 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" 
event={"ID":"ca31ecdf-a575-4275-9d30-6215d448e698","Type":"ContainerStarted","Data":"c4e406b158d0e123a2e0bd98b02a67e87c027d14d72105687d13f2f5f8095a6c"} Apr 16 18:31:55.798866 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.798805 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" event={"ID":"51090ddf-429a-4d9e-a370-e6eb5bd84777","Type":"ContainerStarted","Data":"9bee013d8fa7f1fb53d5f8fbd20d94f4148b9670161b2b14d3713946054bf687"} Apr 16 18:31:55.798866 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.798846 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" event={"ID":"51090ddf-429a-4d9e-a370-e6eb5bd84777","Type":"ContainerStarted","Data":"e2ea2697a2cf3f803a540b66f2c34153ca3c8d644bc1d8968c249e7e43f2187b"} Apr 16 18:31:55.798866 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.798860 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" event={"ID":"51090ddf-429a-4d9e-a370-e6eb5bd84777","Type":"ContainerStarted","Data":"eca5f3c30dbfef8bf724891cd5280f75b17b1c533bcef5aebbd6afcf42439530"} Apr 16 18:31:55.800415 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.800369 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2bwhf" event={"ID":"ba95d7cd-292c-41ec-8417-d3768d65716d","Type":"ContainerStarted","Data":"c7da3eff987fc5e43a260f90ca3db267d919bb4461b9fbb26275787633f08331"} Apr 16 18:31:55.802930 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.802858 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-46rt7" event={"ID":"4c490b1a-eb3a-4699-8602-4ab84dc9d32b","Type":"ContainerStarted","Data":"d73398b9f2ab92c4390ffe207b8c5d6484ca681707abd2b7e154c9f45f1040eb"} Apr 16 18:31:55.802930 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.802891 2580 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-46rt7" event={"ID":"4c490b1a-eb3a-4699-8602-4ab84dc9d32b","Type":"ContainerStarted","Data":"45e9563fe56b53fe7106cbe5bc2d2684e82ddd552f1a8797140cf56ce9e1aad1"} Apr 16 18:31:55.805633 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.805590 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" event={"ID":"929db997-d29c-44d5-9142-d93108884045","Type":"ContainerStarted","Data":"e42983aa8aea772211d5abb7cfad4128f7aef95b31f928854863128f31f4599c"} Apr 16 18:31:55.807093 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.807054 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sjjgw" event={"ID":"642f0536-3a1b-4d5c-bb3d-e7128392b218","Type":"ContainerStarted","Data":"0879bc3b2b20977eae5b180afc0d11339e042b8256765ea474745e57c9b4b9a7"} Apr 16 18:31:55.819630 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.819523 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-6dwrf" podStartSLOduration=2.159347309 podStartE2EDuration="3.8194952s" podCreationTimestamp="2026-04-16 18:31:52 +0000 UTC" firstStartedPulling="2026-04-16 18:31:53.29729873 +0000 UTC m=+64.386562303" lastFinishedPulling="2026-04-16 18:31:54.95744662 +0000 UTC m=+66.046710194" observedRunningTime="2026-04-16 18:31:55.819203305 +0000 UTC m=+66.908466899" watchObservedRunningTime="2026-04-16 18:31:55.8194952 +0000 UTC m=+66.908758794" Apr 16 18:31:55.846416 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.846355 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-46rt7" podStartSLOduration=3.181765754 podStartE2EDuration="3.846333639s" podCreationTimestamp="2026-04-16 18:31:52 +0000 UTC" firstStartedPulling="2026-04-16 18:31:53.176462039 +0000 UTC m=+64.265725614" 
lastFinishedPulling="2026-04-16 18:31:53.841029916 +0000 UTC m=+64.930293499" observedRunningTime="2026-04-16 18:31:55.844823416 +0000 UTC m=+66.934087020" watchObservedRunningTime="2026-04-16 18:31:55.846333639 +0000 UTC m=+66.935597244" Apr 16 18:31:55.866921 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:55.866861 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-5zjsb" podStartSLOduration=2.307713264 podStartE2EDuration="3.866841629s" podCreationTimestamp="2026-04-16 18:31:52 +0000 UTC" firstStartedPulling="2026-04-16 18:31:53.395252446 +0000 UTC m=+64.484516017" lastFinishedPulling="2026-04-16 18:31:54.954380803 +0000 UTC m=+66.043644382" observedRunningTime="2026-04-16 18:31:55.866179753 +0000 UTC m=+66.955443347" watchObservedRunningTime="2026-04-16 18:31:55.866841629 +0000 UTC m=+66.956105221" Apr 16 18:31:56.724016 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.723975 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5776bc855b-mrbhh"] Apr 16 18:31:56.735050 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.735022 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.737999 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.737972 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:31:56.738965 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.738933 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:31:56.738965 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.738933 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-8rp42\"" Apr 16 18:31:56.739127 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.739030 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:31:56.739326 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.739307 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ahbu44dm3tf97\"" Apr 16 18:31:56.739450 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.739434 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:31:56.742355 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.742304 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5776bc855b-mrbhh"] Apr 16 18:31:56.812679 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.812227 2580 generic.go:358] "Generic (PLEG): container finished" podID="aa3404b2-7cf8-4cc7-b029-8cf2ef45e206" containerID="1e136a3f27ccd87a965fa0601e395abe67019f5cf1fb394b3f5cc6287244e943" exitCode=0 Apr 16 18:31:56.812679 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.812471 2580 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206","Type":"ContainerDied","Data":"1e136a3f27ccd87a965fa0601e395abe67019f5cf1fb394b3f5cc6287244e943"} Apr 16 18:31:56.823394 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.823344 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-secret-metrics-server-client-certs\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.823528 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.823399 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e0336541-4641-42a2-9a31-69bf1667218c-audit-log\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.823528 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.823451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-secret-metrics-server-tls\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.823528 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.823480 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-client-ca-bundle\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " 
pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.823680 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.823530 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj82r\" (UniqueName: \"kubernetes.io/projected/e0336541-4641-42a2-9a31-69bf1667218c-kube-api-access-zj82r\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.823680 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.823569 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0336541-4641-42a2-9a31-69bf1667218c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.823680 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.823596 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0336541-4641-42a2-9a31-69bf1667218c-metrics-server-audit-profiles\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.924921 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.924886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-secret-metrics-server-tls\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.924921 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:31:56.924929 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-client-ca-bundle\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.925173 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.925039 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj82r\" (UniqueName: \"kubernetes.io/projected/e0336541-4641-42a2-9a31-69bf1667218c-kube-api-access-zj82r\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.925575 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.925542 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0336541-4641-42a2-9a31-69bf1667218c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.925675 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.925593 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0336541-4641-42a2-9a31-69bf1667218c-metrics-server-audit-profiles\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.926539 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.926481 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e0336541-4641-42a2-9a31-69bf1667218c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.926686 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.926667 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0336541-4641-42a2-9a31-69bf1667218c-metrics-server-audit-profiles\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.926766 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.926670 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-secret-metrics-server-client-certs\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.926766 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.926739 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e0336541-4641-42a2-9a31-69bf1667218c-audit-log\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.927381 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.927349 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e0336541-4641-42a2-9a31-69bf1667218c-audit-log\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 
18:31:56.928186 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.928125 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-client-ca-bundle\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.928281 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.928249 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-secret-metrics-server-tls\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.928992 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.928970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e0336541-4641-42a2-9a31-69bf1667218c-secret-metrics-server-client-certs\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.938420 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.938363 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj82r\" (UniqueName: \"kubernetes.io/projected/e0336541-4641-42a2-9a31-69bf1667218c-kube-api-access-zj82r\") pod \"metrics-server-5776bc855b-mrbhh\" (UID: \"e0336541-4641-42a2-9a31-69bf1667218c\") " pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:56.990002 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:56.989913 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-678f445d6c-ps4hz"] Apr 16 18:31:57.008224 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:31:57.008133 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg"] Apr 16 18:31:57.034715 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.034663 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-748f7895cf-kxtl9"] Apr 16 18:31:57.034878 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.034805 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" Apr 16 18:31:57.039021 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.037546 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 18:31:57.039021 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.037825 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-gvjs7\"" Apr 16 18:31:57.050786 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.050735 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:31:57.056615 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.056589 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg"] Apr 16 18:31:57.056615 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.056617 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-748f7895cf-kxtl9"] Apr 16 18:31:57.056798 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.056711 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.129516 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.129480 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-service-ca\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.129850 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.129530 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-oauth-config\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.129850 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.129555 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-oauth-serving-cert\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.129850 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.129709 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-trusted-ca-bundle\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.129850 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.129791 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/31981e96-805e-41b8-a6dd-0d2bf7af45d8-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-j6nsg\" (UID: \"31981e96-805e-41b8-a6dd-0d2bf7af45d8\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" Apr 16 18:31:57.129850 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.129825 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-console-config\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.130121 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.129892 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thv8l\" (UniqueName: \"kubernetes.io/projected/62209e8e-6f79-4034-ad78-880129363760-kube-api-access-thv8l\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.130121 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.129924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-serving-cert\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.231440 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.231335 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thv8l\" (UniqueName: \"kubernetes.io/projected/62209e8e-6f79-4034-ad78-880129363760-kube-api-access-thv8l\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " 
pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.231440 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.231409 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-serving-cert\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.231693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.231458 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-service-ca\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.231693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.231487 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-oauth-config\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.231693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.231512 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-oauth-serving-cert\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.231693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.231579 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-trusted-ca-bundle\") pod 
\"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.231693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.231614 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/31981e96-805e-41b8-a6dd-0d2bf7af45d8-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-j6nsg\" (UID: \"31981e96-805e-41b8-a6dd-0d2bf7af45d8\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" Apr 16 18:31:57.231693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.231642 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-console-config\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.232479 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.232424 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-oauth-serving-cert\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.232896 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.232808 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-service-ca\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.232998 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.232975 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-console-config\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.233875 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.233828 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-trusted-ca-bundle\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.235598 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.235557 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-serving-cert\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.235598 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.235585 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-oauth-config\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.235736 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.235617 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/31981e96-805e-41b8-a6dd-0d2bf7af45d8-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-j6nsg\" (UID: \"31981e96-805e-41b8-a6dd-0d2bf7af45d8\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" Apr 16 18:31:57.240675 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.240613 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-thv8l\" (UniqueName: \"kubernetes.io/projected/62209e8e-6f79-4034-ad78-880129363760-kube-api-access-thv8l\") pod \"console-748f7895cf-kxtl9\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.348505 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.348425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" Apr 16 18:31:57.369650 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.369585 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:31:57.635317 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.635224 2580 patch_prober.go:28] interesting pod/image-registry-65688f687b-mwsdh container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:31:57.635317 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:57.635297 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-65688f687b-mwsdh" podUID="9922e3fb-d0ae-4fbd-b96a-81793a1f521a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:58.617760 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.617691 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:31:58.636076 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.636042 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.641128 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.640956 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:31:58.641963 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.641944 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:31:58.642246 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.641991 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:31:58.648207 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.647783 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:31:58.648207 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.648038 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:31:58.649174 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.648684 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:31:58.649174 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.648699 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:31:58.649994 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.649606 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bdf5vmu56v13o\"" Apr 16 18:31:58.649994 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.649832 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:31:58.650503 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.650272 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:31:58.650503 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.650415 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:31:58.650842 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.650668 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:31:58.650842 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.650715 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:31:58.650842 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.650783 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-kjgvk\"" Apr 16 18:31:58.651216 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.651081 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:31:58.744987 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.744948 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745237 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.744999 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745237 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745086 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745237 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745254 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:31:58.745412 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745379 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745606 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745409 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745606 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745463 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-config\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745606 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745487 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqmx\" (UniqueName: \"kubernetes.io/projected/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-kube-api-access-8zqmx\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745606 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745606 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745577 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745697 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745724 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745944 
ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745918 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.745983 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.745970 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.746024 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.746002 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.846583 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846541 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.846764 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.846764 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846628 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.846764 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.846764 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.846764 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846746 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847008 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847008 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846809 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847008 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846844 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847008 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846871 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847008 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846896 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847008 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846932 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847008 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846967 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847008 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.846995 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847433 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.847020 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-config\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847433 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.847043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqmx\" (UniqueName: \"kubernetes.io/projected/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-kube-api-access-8zqmx\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847433 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.847081 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847433 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.847108 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.847433 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.847426 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.848203 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.847845 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.848203 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.848166 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.848828 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.848465 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.849200 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.849113 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.849983 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.849942 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.850245 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.850194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.850626 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.850601 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.850733 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.850703 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.851331 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.851308 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.860201 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.860088 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.860776 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.860722 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.862820 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.862589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.862820 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.862648 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.863295 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.863252 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-config\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.863756 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.863715 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.868092 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.868067 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.870759 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.870736 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqmx\" (UniqueName: 
\"kubernetes.io/projected/73d5455b-55b6-44a7-8c92-e363b7d1b2f2-kube-api-access-8zqmx\") pod \"prometheus-k8s-0\" (UID: \"73d5455b-55b6-44a7-8c92-e363b7d1b2f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:58.952359 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:58.951876 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:31:59.719655 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:59.719626 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-65688f687b-mwsdh" Apr 16 18:31:59.723452 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:59.723352 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-748f7895cf-kxtl9"] Apr 16 18:31:59.739115 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:31:59.739082 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg"] Apr 16 18:31:59.988988 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:59.988953 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62209e8e_6f79_4034_ad78_880129363760.slice/crio-dc1f2474a90cd4e2bf31a60c6dfe8dcf1f8a49c85f72685c3b7323d220c75ad1 WatchSource:0}: Error finding container dc1f2474a90cd4e2bf31a60c6dfe8dcf1f8a49c85f72685c3b7323d220c75ad1: Status 404 returned error can't find the container with id dc1f2474a90cd4e2bf31a60c6dfe8dcf1f8a49c85f72685c3b7323d220c75ad1 Apr 16 18:31:59.990939 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:31:59.990903 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31981e96_805e_41b8_a6dd_0d2bf7af45d8.slice/crio-32fe12f9c81c7e005f3b37753964a22b4ea149102c08fbbb02fe9abfb88ee7e9 WatchSource:0}: Error finding container 
32fe12f9c81c7e005f3b37753964a22b4ea149102c08fbbb02fe9abfb88ee7e9: Status 404 returned error can't find the container with id 32fe12f9c81c7e005f3b37753964a22b4ea149102c08fbbb02fe9abfb88ee7e9 Apr 16 18:32:00.165541 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.164934 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5776bc855b-mrbhh"] Apr 16 18:32:00.168632 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:32:00.168595 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0336541_4641_42a2_9a31_69bf1667218c.slice/crio-1a7693a5c657545795b1710f656ea16c9caab92a82c71ae1b95288cb457cdc78 WatchSource:0}: Error finding container 1a7693a5c657545795b1710f656ea16c9caab92a82c71ae1b95288cb457cdc78: Status 404 returned error can't find the container with id 1a7693a5c657545795b1710f656ea16c9caab92a82c71ae1b95288cb457cdc78 Apr 16 18:32:00.212989 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.212469 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:32:00.215902 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:32:00.215868 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d5455b_55b6_44a7_8c92_e363b7d1b2f2.slice/crio-352e1a518d2ee8738dcf3b17a006a2859b12ef2956060ebbb8bd773129b5485b WatchSource:0}: Error finding container 352e1a518d2ee8738dcf3b17a006a2859b12ef2956060ebbb8bd773129b5485b: Status 404 returned error can't find the container with id 352e1a518d2ee8738dcf3b17a006a2859b12ef2956060ebbb8bd773129b5485b Apr 16 18:32:00.827715 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.827637 2580 generic.go:358] "Generic (PLEG): container finished" podID="73d5455b-55b6-44a7-8c92-e363b7d1b2f2" containerID="eb2f216f7bfca2f5119bcf76cbde3daac98f8b1d51d2bb7fd163573ed0d2df8a" exitCode=0 Apr 16 18:32:00.828221 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:32:00.827736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73d5455b-55b6-44a7-8c92-e363b7d1b2f2","Type":"ContainerDied","Data":"eb2f216f7bfca2f5119bcf76cbde3daac98f8b1d51d2bb7fd163573ed0d2df8a"} Apr 16 18:32:00.828221 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.827767 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73d5455b-55b6-44a7-8c92-e363b7d1b2f2","Type":"ContainerStarted","Data":"352e1a518d2ee8738dcf3b17a006a2859b12ef2956060ebbb8bd773129b5485b"} Apr 16 18:32:00.829554 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.829525 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" event={"ID":"31981e96-805e-41b8-a6dd-0d2bf7af45d8","Type":"ContainerStarted","Data":"32fe12f9c81c7e005f3b37753964a22b4ea149102c08fbbb02fe9abfb88ee7e9"} Apr 16 18:32:00.832698 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.832619 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" event={"ID":"ca31ecdf-a575-4275-9d30-6215d448e698","Type":"ContainerStarted","Data":"2d7c696f18ba01695b66c6d9c28b7cbe8a9634f9749a3fe7bb9b67f90ffdc5d0"} Apr 16 18:32:00.832698 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.832657 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" event={"ID":"ca31ecdf-a575-4275-9d30-6215d448e698","Type":"ContainerStarted","Data":"1d1b550bd3c68b14aba868d7808cf04e2cb8d2f7c14561ac053e5ae9fdf0f3d8"} Apr 16 18:32:00.832698 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.832674 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" 
event={"ID":"ca31ecdf-a575-4275-9d30-6215d448e698","Type":"ContainerStarted","Data":"a15ce55c79fa74890ea3b3eb533ee284fba19e0b26cb835ee77d5af3b90e05de"} Apr 16 18:32:00.836702 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.836667 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206","Type":"ContainerStarted","Data":"a0c7df5c63d3c007b365736f6137519354a14f32399069798d811517d9edc5d9"} Apr 16 18:32:00.836702 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.836705 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206","Type":"ContainerStarted","Data":"8bf4bf2abda27a7598145a994a0f51e9231fed93669f90ce38fb761d8611d2c4"} Apr 16 18:32:00.836875 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.836719 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206","Type":"ContainerStarted","Data":"c6dee946f069d69b32ea26cc43327cea7a1dbba90f62efcf435a59c04b54302f"} Apr 16 18:32:00.836875 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.836732 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206","Type":"ContainerStarted","Data":"0344c93126ef7452bafc41b968559453815b13060ba88dc1c4bc47b3d03091cb"} Apr 16 18:32:00.836875 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.836744 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206","Type":"ContainerStarted","Data":"0a6b6cd7b015903a1cb44e083f022f2d8e03b9315d613f43d6ed869e5e4cad99"} Apr 16 18:32:00.838821 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.838794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-2bwhf" event={"ID":"ba95d7cd-292c-41ec-8417-d3768d65716d","Type":"ContainerStarted","Data":"19de563aff2b057f15697d25f06f2731dfeb2b51749d2327e5fba56898b2411f"} Apr 16 18:32:00.838959 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.838934 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:32:00.840830 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.840805 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sjjgw" event={"ID":"642f0536-3a1b-4d5c-bb3d-e7128392b218","Type":"ContainerStarted","Data":"dbf62cb6a2980ef485bf2391276f1e89ea956536d43703109b2c1d6e29e8bfe3"} Apr 16 18:32:00.840941 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.840838 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sjjgw" event={"ID":"642f0536-3a1b-4d5c-bb3d-e7128392b218","Type":"ContainerStarted","Data":"803455f8f4434e40f0c469fed54db211fcad0bd89ef094f1c9a792892ffca366"} Apr 16 18:32:00.842436 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.842415 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748f7895cf-kxtl9" event={"ID":"62209e8e-6f79-4034-ad78-880129363760","Type":"ContainerStarted","Data":"da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8"} Apr 16 18:32:00.842545 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.842444 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748f7895cf-kxtl9" event={"ID":"62209e8e-6f79-4034-ad78-880129363760","Type":"ContainerStarted","Data":"dc1f2474a90cd4e2bf31a60c6dfe8dcf1f8a49c85f72685c3b7323d220c75ad1"} Apr 16 18:32:00.843664 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.843641 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" 
event={"ID":"e0336541-4641-42a2-9a31-69bf1667218c","Type":"ContainerStarted","Data":"1a7693a5c657545795b1710f656ea16c9caab92a82c71ae1b95288cb457cdc78"} Apr 16 18:32:00.882899 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.882818 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2bwhf" podStartSLOduration=67.475205043 podStartE2EDuration="1m11.882799021s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:31:55.586734435 +0000 UTC m=+66.675998007" lastFinishedPulling="2026-04-16 18:31:59.994328402 +0000 UTC m=+71.083591985" observedRunningTime="2026-04-16 18:32:00.881466317 +0000 UTC m=+71.970729911" watchObservedRunningTime="2026-04-16 18:32:00.882799021 +0000 UTC m=+71.972062614" Apr 16 18:32:00.902859 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.900777 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sjjgw" podStartSLOduration=67.871303626 podStartE2EDuration="1m11.900761427s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:31:55.555645158 +0000 UTC m=+66.644908736" lastFinishedPulling="2026-04-16 18:31:59.585102952 +0000 UTC m=+70.674366537" observedRunningTime="2026-04-16 18:32:00.900506273 +0000 UTC m=+71.989769865" watchObservedRunningTime="2026-04-16 18:32:00.900761427 +0000 UTC m=+71.990025021" Apr 16 18:32:00.928028 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:00.927317 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-748f7895cf-kxtl9" podStartSLOduration=4.92729356 podStartE2EDuration="4.92729356s" podCreationTimestamp="2026-04-16 18:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:00.925844547 +0000 UTC m=+72.015108138" watchObservedRunningTime="2026-04-16 
18:32:00.92729356 +0000 UTC m=+72.016557154" Apr 16 18:32:02.851014 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.850974 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" event={"ID":"e0336541-4641-42a2-9a31-69bf1667218c","Type":"ContainerStarted","Data":"1f7b06e5089bf5cdd2b2ce90642146e0cf3b93a5e38a79e242fe781e4be155e0"} Apr 16 18:32:02.852426 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.852397 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" event={"ID":"31981e96-805e-41b8-a6dd-0d2bf7af45d8","Type":"ContainerStarted","Data":"c4e2ff6e6a910bde2c0abe1905abc1c8925a03274979f537ee6299704e7d3f8e"} Apr 16 18:32:02.852561 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.852520 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" Apr 16 18:32:02.855530 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.855504 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" event={"ID":"ca31ecdf-a575-4275-9d30-6215d448e698","Type":"ContainerStarted","Data":"3eb8aca07ab8ecd753f34b32fb954ac8f40d138e19798f6ca9a01c062a989a61"} Apr 16 18:32:02.855647 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.855536 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" event={"ID":"ca31ecdf-a575-4275-9d30-6215d448e698","Type":"ContainerStarted","Data":"939fb5e9a32492667b684e53ac9f263990a2ded177eac880e49f0a2bb2141211"} Apr 16 18:32:02.855647 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.855551 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" 
event={"ID":"ca31ecdf-a575-4275-9d30-6215d448e698","Type":"ContainerStarted","Data":"e38f7a4a33aa15a6530f14e4bad15154065d09d4f9eb5158b0efb6d14b038a3e"} Apr 16 18:32:02.855647 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.855623 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:32:02.858157 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.858123 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" Apr 16 18:32:02.858571 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.858553 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa3404b2-7cf8-4cc7-b029-8cf2ef45e206","Type":"ContainerStarted","Data":"67871604afd5d5db435d0fe3a3374e136288449ce61173a4207e240086976219"} Apr 16 18:32:02.870625 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.870585 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" podStartSLOduration=5.157855453 podStartE2EDuration="6.870574687s" podCreationTimestamp="2026-04-16 18:31:56 +0000 UTC" firstStartedPulling="2026-04-16 18:32:00.172488771 +0000 UTC m=+71.261752342" lastFinishedPulling="2026-04-16 18:32:01.885208005 +0000 UTC m=+72.974471576" observedRunningTime="2026-04-16 18:32:02.868349653 +0000 UTC m=+73.957613261" watchObservedRunningTime="2026-04-16 18:32:02.870574687 +0000 UTC m=+73.959838279" Apr 16 18:32:02.898866 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.898802 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.622531661 podStartE2EDuration="9.898784244s" podCreationTimestamp="2026-04-16 18:31:53 +0000 UTC" firstStartedPulling="2026-04-16 18:31:54.605072818 +0000 UTC m=+65.694336397" 
lastFinishedPulling="2026-04-16 18:32:01.881325395 +0000 UTC m=+72.970588980" observedRunningTime="2026-04-16 18:32:02.896913694 +0000 UTC m=+73.986177284" watchObservedRunningTime="2026-04-16 18:32:02.898784244 +0000 UTC m=+73.988047839" Apr 16 18:32:02.921009 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.920950 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-j6nsg" podStartSLOduration=5.033961496 podStartE2EDuration="6.920931805s" podCreationTimestamp="2026-04-16 18:31:56 +0000 UTC" firstStartedPulling="2026-04-16 18:31:59.994535093 +0000 UTC m=+71.083798679" lastFinishedPulling="2026-04-16 18:32:01.881505402 +0000 UTC m=+72.970768988" observedRunningTime="2026-04-16 18:32:02.916651407 +0000 UTC m=+74.005915001" watchObservedRunningTime="2026-04-16 18:32:02.920931805 +0000 UTC m=+74.010195399" Apr 16 18:32:02.945810 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:02.945745 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" podStartSLOduration=2.2236129350000002 podStartE2EDuration="8.945725656s" podCreationTimestamp="2026-04-16 18:31:54 +0000 UTC" firstStartedPulling="2026-04-16 18:31:55.15970595 +0000 UTC m=+66.248969521" lastFinishedPulling="2026-04-16 18:32:01.881818668 +0000 UTC m=+72.971082242" observedRunningTime="2026-04-16 18:32:02.942033299 +0000 UTC m=+74.031296896" watchObservedRunningTime="2026-04-16 18:32:02.945725656 +0000 UTC m=+74.034989251" Apr 16 18:32:03.599654 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:03.599617 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c8f9bdf47-kv598" Apr 16 18:32:03.604285 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:03.604261 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c8f9bdf47-kv598" Apr 16 18:32:05.870516 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:32:05.870483 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73d5455b-55b6-44a7-8c92-e363b7d1b2f2","Type":"ContainerStarted","Data":"345fe0ca81949fb6429d2b82de7cdf88ae4db92fb1ede6a1ac2a173e939d4d45"} Apr 16 18:32:05.870516 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:05.870521 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73d5455b-55b6-44a7-8c92-e363b7d1b2f2","Type":"ContainerStarted","Data":"faf2d737670b922ba76f868632e081bb79c8457b6d918165fe82ff707e34e98c"} Apr 16 18:32:05.870940 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:05.870535 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73d5455b-55b6-44a7-8c92-e363b7d1b2f2","Type":"ContainerStarted","Data":"0fd37c6c852e51ae14c89a4429b52725e4056b7e27889328b7e06497835a6dfc"} Apr 16 18:32:05.870940 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:05.870547 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73d5455b-55b6-44a7-8c92-e363b7d1b2f2","Type":"ContainerStarted","Data":"ab132400eb9aa5fcd87a6e84a26cdbe0d3d4058fdb3d8edb7934e9ac11c055c1"} Apr 16 18:32:05.870940 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:05.870557 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73d5455b-55b6-44a7-8c92-e363b7d1b2f2","Type":"ContainerStarted","Data":"7031dbce4f54fb3a17ea2989e0be9a0fbe9baaee8a6d34ec97d91142249071da"} Apr 16 18:32:05.870940 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:05.870568 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73d5455b-55b6-44a7-8c92-e363b7d1b2f2","Type":"ContainerStarted","Data":"3bdd5887a0b1a190154dd6c097fe6a19d3763193922f485082fafcb19fecf353"} Apr 16 18:32:05.913191 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:32:05.913115 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.78006011 podStartE2EDuration="7.913098641s" podCreationTimestamp="2026-04-16 18:31:58 +0000 UTC" firstStartedPulling="2026-04-16 18:32:00.829592745 +0000 UTC m=+71.918856315" lastFinishedPulling="2026-04-16 18:32:04.962631272 +0000 UTC m=+76.051894846" observedRunningTime="2026-04-16 18:32:05.910525413 +0000 UTC m=+76.999789006" watchObservedRunningTime="2026-04-16 18:32:05.913098641 +0000 UTC m=+77.002362226" Apr 16 18:32:07.370468 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:07.370435 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:32:07.370816 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:07.370509 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:32:07.375983 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:07.375960 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:32:07.879722 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:07.879695 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:32:07.953553 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:07.953522 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c8f9bdf47-kv598"] Apr 16 18:32:08.868357 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:08.868323 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-d54fc9d9-x7l2k" Apr 16 18:32:08.952616 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:08.952589 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:32:17.051232 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:17.051189 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:32:17.051232 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:17.051239 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:32:22.014916 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.014873 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-678f445d6c-ps4hz" podUID="96663b4b-e1a0-4eee-bbf7-c53a50536f00" containerName="console" containerID="cri-o://de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e" gracePeriod=15 Apr 16 18:32:22.252443 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.252422 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-678f445d6c-ps4hz_96663b4b-e1a0-4eee-bbf7-c53a50536f00/console/0.log" Apr 16 18:32:22.252573 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.252487 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:32:22.280137 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280052 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfc44\" (UniqueName: \"kubernetes.io/projected/96663b4b-e1a0-4eee-bbf7-c53a50536f00-kube-api-access-lfc44\") pod \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " Apr 16 18:32:22.280137 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280092 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-oauth-serving-cert\") pod \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " Apr 16 18:32:22.280358 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280187 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-config\") pod \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " Apr 16 18:32:22.280358 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280212 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-service-ca\") pod \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " Apr 16 18:32:22.280358 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280257 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-serving-cert\") pod \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " Apr 16 18:32:22.280358 
ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280278 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-oauth-config\") pod \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\" (UID: \"96663b4b-e1a0-4eee-bbf7-c53a50536f00\") " Apr 16 18:32:22.280568 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280542 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96663b4b-e1a0-4eee-bbf7-c53a50536f00" (UID: "96663b4b-e1a0-4eee-bbf7-c53a50536f00"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:22.280630 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280606 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-config" (OuterVolumeSpecName: "console-config") pod "96663b4b-e1a0-4eee-bbf7-c53a50536f00" (UID: "96663b4b-e1a0-4eee-bbf7-c53a50536f00"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:22.280675 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.280611 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-service-ca" (OuterVolumeSpecName: "service-ca") pod "96663b4b-e1a0-4eee-bbf7-c53a50536f00" (UID: "96663b4b-e1a0-4eee-bbf7-c53a50536f00"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:22.282545 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.282510 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96663b4b-e1a0-4eee-bbf7-c53a50536f00" (UID: "96663b4b-e1a0-4eee-bbf7-c53a50536f00"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:22.282669 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.282581 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96663b4b-e1a0-4eee-bbf7-c53a50536f00-kube-api-access-lfc44" (OuterVolumeSpecName: "kube-api-access-lfc44") pod "96663b4b-e1a0-4eee-bbf7-c53a50536f00" (UID: "96663b4b-e1a0-4eee-bbf7-c53a50536f00"). InnerVolumeSpecName "kube-api-access-lfc44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:32:22.282669 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.282589 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96663b4b-e1a0-4eee-bbf7-c53a50536f00" (UID: "96663b4b-e1a0-4eee-bbf7-c53a50536f00"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:22.381857 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.381810 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:22.381857 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.381849 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-service-ca\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:22.381857 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.381862 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:22.382104 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.381878 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96663b4b-e1a0-4eee-bbf7-c53a50536f00-console-oauth-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:22.382104 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.381891 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lfc44\" (UniqueName: \"kubernetes.io/projected/96663b4b-e1a0-4eee-bbf7-c53a50536f00-kube-api-access-lfc44\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:22.382104 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.381904 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96663b4b-e1a0-4eee-bbf7-c53a50536f00-oauth-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:22.923163 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:32:22.923128 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-678f445d6c-ps4hz_96663b4b-e1a0-4eee-bbf7-c53a50536f00/console/0.log" Apr 16 18:32:22.923359 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.923192 2580 generic.go:358] "Generic (PLEG): container finished" podID="96663b4b-e1a0-4eee-bbf7-c53a50536f00" containerID="de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e" exitCode=2 Apr 16 18:32:22.923359 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.923225 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678f445d6c-ps4hz" event={"ID":"96663b4b-e1a0-4eee-bbf7-c53a50536f00","Type":"ContainerDied","Data":"de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e"} Apr 16 18:32:22.923359 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.923254 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-678f445d6c-ps4hz" Apr 16 18:32:22.923359 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.923269 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-678f445d6c-ps4hz" event={"ID":"96663b4b-e1a0-4eee-bbf7-c53a50536f00","Type":"ContainerDied","Data":"a9badc9d776cfae2557ad1e9f72d0f73145dea4ab87fe3d276d6a3a72590a7ed"} Apr 16 18:32:22.923359 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.923289 2580 scope.go:117] "RemoveContainer" containerID="de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e" Apr 16 18:32:22.934639 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.933011 2580 scope.go:117] "RemoveContainer" containerID="de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e" Apr 16 18:32:22.934639 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:32:22.934594 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e\": container with ID starting with de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e not found: ID does not exist" containerID="de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e" Apr 16 18:32:22.934806 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.934619 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e"} err="failed to get container status \"de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e\": rpc error: code = NotFound desc = could not find container \"de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e\": container with ID starting with de4f2f14819c9b2876f4a3d6a62b83a452451090964a04d324674492e153454e not found: ID does not exist" Apr 16 18:32:22.945393 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.945364 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-678f445d6c-ps4hz"] Apr 16 18:32:22.949056 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:22.949038 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-678f445d6c-ps4hz"] Apr 16 18:32:23.539454 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:23.539421 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96663b4b-e1a0-4eee-bbf7-c53a50536f00" path="/var/lib/kubelet/pods/96663b4b-e1a0-4eee-bbf7-c53a50536f00/volumes" Apr 16 18:32:31.849755 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:31.849720 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2bwhf" Apr 16 18:32:32.975100 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:32.975065 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c8f9bdf47-kv598" 
podUID="5b5a2164-270c-45a5-a588-d6ae77297d86" containerName="console" containerID="cri-o://7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b" gracePeriod=15 Apr 16 18:32:33.251356 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.251332 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c8f9bdf47-kv598_5b5a2164-270c-45a5-a588-d6ae77297d86/console/0.log" Apr 16 18:32:33.251510 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.251396 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c8f9bdf47-kv598" Apr 16 18:32:33.279131 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279101 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-oauth-serving-cert\") pod \"5b5a2164-270c-45a5-a588-d6ae77297d86\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " Apr 16 18:32:33.279328 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279166 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9hsg\" (UniqueName: \"kubernetes.io/projected/5b5a2164-270c-45a5-a588-d6ae77297d86-kube-api-access-m9hsg\") pod \"5b5a2164-270c-45a5-a588-d6ae77297d86\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " Apr 16 18:32:33.279511 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279361 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-oauth-config\") pod \"5b5a2164-270c-45a5-a588-d6ae77297d86\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " Apr 16 18:32:33.279621 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279543 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-trusted-ca-bundle\") pod \"5b5a2164-270c-45a5-a588-d6ae77297d86\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " Apr 16 18:32:33.279621 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279551 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5b5a2164-270c-45a5-a588-d6ae77297d86" (UID: "5b5a2164-270c-45a5-a588-d6ae77297d86"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:33.279621 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279598 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-service-ca\") pod \"5b5a2164-270c-45a5-a588-d6ae77297d86\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " Apr 16 18:32:33.279785 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279647 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-serving-cert\") pod \"5b5a2164-270c-45a5-a588-d6ae77297d86\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " Apr 16 18:32:33.279785 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279683 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-console-config\") pod \"5b5a2164-270c-45a5-a588-d6ae77297d86\" (UID: \"5b5a2164-270c-45a5-a588-d6ae77297d86\") " Apr 16 18:32:33.279972 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.279952 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-oauth-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:33.280063 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.280014 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-service-ca" (OuterVolumeSpecName: "service-ca") pod "5b5a2164-270c-45a5-a588-d6ae77297d86" (UID: "5b5a2164-270c-45a5-a588-d6ae77297d86"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:33.280063 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.280031 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5b5a2164-270c-45a5-a588-d6ae77297d86" (UID: "5b5a2164-270c-45a5-a588-d6ae77297d86"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:33.280417 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.280394 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-console-config" (OuterVolumeSpecName: "console-config") pod "5b5a2164-270c-45a5-a588-d6ae77297d86" (UID: "5b5a2164-270c-45a5-a588-d6ae77297d86"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:33.281654 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.281631 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5a2164-270c-45a5-a588-d6ae77297d86-kube-api-access-m9hsg" (OuterVolumeSpecName: "kube-api-access-m9hsg") pod "5b5a2164-270c-45a5-a588-d6ae77297d86" (UID: "5b5a2164-270c-45a5-a588-d6ae77297d86"). InnerVolumeSpecName "kube-api-access-m9hsg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:32:33.282441 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.282404 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5b5a2164-270c-45a5-a588-d6ae77297d86" (UID: "5b5a2164-270c-45a5-a588-d6ae77297d86"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:33.282528 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.282486 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5b5a2164-270c-45a5-a588-d6ae77297d86" (UID: "5b5a2164-270c-45a5-a588-d6ae77297d86"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:33.380737 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.380704 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-service-ca\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:33.380737 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.380732 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:33.380737 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.380742 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-console-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:33.380737 ip-10-0-135-146 kubenswrapper[2580]: 
I0416 18:32:33.380751 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m9hsg\" (UniqueName: \"kubernetes.io/projected/5b5a2164-270c-45a5-a588-d6ae77297d86-kube-api-access-m9hsg\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:33.380999 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.380761 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b5a2164-270c-45a5-a588-d6ae77297d86-console-oauth-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:33.380999 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.380769 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b5a2164-270c-45a5-a588-d6ae77297d86-trusted-ca-bundle\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:32:33.956189 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.956099 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c8f9bdf47-kv598_5b5a2164-270c-45a5-a588-d6ae77297d86/console/0.log" Apr 16 18:32:33.956189 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.956138 2580 generic.go:358] "Generic (PLEG): container finished" podID="5b5a2164-270c-45a5-a588-d6ae77297d86" containerID="7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b" exitCode=2 Apr 16 18:32:33.956368 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.956230 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c8f9bdf47-kv598" Apr 16 18:32:33.956368 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.956232 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c8f9bdf47-kv598" event={"ID":"5b5a2164-270c-45a5-a588-d6ae77297d86","Type":"ContainerDied","Data":"7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b"} Apr 16 18:32:33.956368 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.956333 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c8f9bdf47-kv598" event={"ID":"5b5a2164-270c-45a5-a588-d6ae77297d86","Type":"ContainerDied","Data":"f3956496fdfadb38aaddaf161347f14df418a1404b604cd38a82afb16bed454b"} Apr 16 18:32:33.956368 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.956348 2580 scope.go:117] "RemoveContainer" containerID="7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b" Apr 16 18:32:33.964230 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.964197 2580 scope.go:117] "RemoveContainer" containerID="7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b" Apr 16 18:32:33.964475 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:32:33.964456 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b\": container with ID starting with 7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b not found: ID does not exist" containerID="7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b" Apr 16 18:32:33.964525 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.964485 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b"} err="failed to get container status \"7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b\": rpc error: code = 
NotFound desc = could not find container \"7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b\": container with ID starting with 7cecb7c3ac7f9f1fcdd018b84ebe8d994cdd6372b07f1b33c3dcbbe00279cb6b not found: ID does not exist" Apr 16 18:32:33.989715 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.989677 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c8f9bdf47-kv598"] Apr 16 18:32:33.999336 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:33.999312 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c8f9bdf47-kv598"] Apr 16 18:32:35.539363 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:35.539327 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5a2164-270c-45a5-a588-d6ae77297d86" path="/var/lib/kubelet/pods/5b5a2164-270c-45a5-a588-d6ae77297d86/volumes" Apr 16 18:32:37.056663 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:37.056634 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:32:37.060573 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:37.060548 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5776bc855b-mrbhh" Apr 16 18:32:58.952545 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:58.952503 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:32:58.973035 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:58.973010 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:32:59.047109 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:32:59.047076 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.678435 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.678403 2580 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56cb865b56-2jjbw"] Apr 16 18:33:20.679073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.678779 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b5a2164-270c-45a5-a588-d6ae77297d86" containerName="console" Apr 16 18:33:20.679073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.678799 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5a2164-270c-45a5-a588-d6ae77297d86" containerName="console" Apr 16 18:33:20.679073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.678837 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96663b4b-e1a0-4eee-bbf7-c53a50536f00" containerName="console" Apr 16 18:33:20.679073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.678844 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="96663b4b-e1a0-4eee-bbf7-c53a50536f00" containerName="console" Apr 16 18:33:20.679073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.678914 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="96663b4b-e1a0-4eee-bbf7-c53a50536f00" containerName="console" Apr 16 18:33:20.679073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.678927 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b5a2164-270c-45a5-a588-d6ae77297d86" containerName="console" Apr 16 18:33:20.683569 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.683547 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.693866 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.693845 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56cb865b56-2jjbw"] Apr 16 18:33:20.804022 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.803978 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-trusted-ca-bundle\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.804376 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.804353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5552s\" (UniqueName: \"kubernetes.io/projected/560cc19f-7a6f-44e1-9dc5-339e345023b6-kube-api-access-5552s\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.804556 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.804543 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-oauth-config\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.804697 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.804685 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-oauth-serving-cert\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " 
pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.804831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.804819 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-config\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.804947 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.804935 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-service-ca\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.805069 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.805056 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-serving-cert\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.906548 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.906507 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-config\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.906745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.906555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-service-ca\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.906745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.906601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-serving-cert\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.906745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.906635 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-trusted-ca-bundle\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.906745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.906672 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5552s\" (UniqueName: \"kubernetes.io/projected/560cc19f-7a6f-44e1-9dc5-339e345023b6-kube-api-access-5552s\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.906745 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.906724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-oauth-config\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.906990 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.906766 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-oauth-serving-cert\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.907417 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.907387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-service-ca\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.907534 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.907519 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-config\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.907595 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.907530 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-oauth-serving-cert\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.907666 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.907643 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-trusted-ca-bundle\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.909293 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.909264 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-oauth-config\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.909430 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.909356 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-serving-cert\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.920044 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.920014 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5552s\" (UniqueName: \"kubernetes.io/projected/560cc19f-7a6f-44e1-9dc5-339e345023b6-kube-api-access-5552s\") pod \"console-56cb865b56-2jjbw\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:20.993484 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:20.993440 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:21.133914 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:21.133889 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56cb865b56-2jjbw"] Apr 16 18:33:21.136999 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:33:21.136959 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod560cc19f_7a6f_44e1_9dc5_339e345023b6.slice/crio-17ed91b6c58c8a006929a31fa19309c622adb59dd36e414be0bb36eb740e955f WatchSource:0}: Error finding container 17ed91b6c58c8a006929a31fa19309c622adb59dd36e414be0bb36eb740e955f: Status 404 returned error can't find the container with id 17ed91b6c58c8a006929a31fa19309c622adb59dd36e414be0bb36eb740e955f Apr 16 18:33:22.103582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:22.103547 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cb865b56-2jjbw" event={"ID":"560cc19f-7a6f-44e1-9dc5-339e345023b6","Type":"ContainerStarted","Data":"ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19"} Apr 16 18:33:22.103582 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:22.103584 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cb865b56-2jjbw" event={"ID":"560cc19f-7a6f-44e1-9dc5-339e345023b6","Type":"ContainerStarted","Data":"17ed91b6c58c8a006929a31fa19309c622adb59dd36e414be0bb36eb740e955f"} Apr 16 18:33:30.994332 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:30.994220 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:30.994332 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:30.994288 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:30.998981 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:30.998961 2580 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:31.018706 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.018664 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56cb865b56-2jjbw" podStartSLOduration=11.018651501 podStartE2EDuration="11.018651501s" podCreationTimestamp="2026-04-16 18:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:22.129553656 +0000 UTC m=+153.218817251" watchObservedRunningTime="2026-04-16 18:33:31.018651501 +0000 UTC m=+162.107915094" Apr 16 18:33:31.133134 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.133106 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:33:31.206577 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.206544 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-748f7895cf-kxtl9"] Apr 16 18:33:31.333681 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.333603 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-89ccddb4-ql8kd"] Apr 16 18:33:31.336711 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.336682 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.349837 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.349814 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-89ccddb4-ql8kd"] Apr 16 18:33:31.402171 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.402116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-config\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.402324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.402201 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-oauth-config\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.402324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.402220 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jk6\" (UniqueName: \"kubernetes.io/projected/2e17c4e0-04ae-49d0-8c52-562cf1661296-kube-api-access-w9jk6\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.402402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.402353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-trusted-ca-bundle\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 
18:33:31.402402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.402380 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-oauth-serving-cert\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.402481 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.402406 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-serving-cert\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.402481 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.402437 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-service-ca\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.503390 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.503351 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-service-ca\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.503575 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.503403 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-config\") pod \"console-89ccddb4-ql8kd\" (UID: 
\"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.503575 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.503431 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-oauth-config\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.503575 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.503447 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jk6\" (UniqueName: \"kubernetes.io/projected/2e17c4e0-04ae-49d0-8c52-562cf1661296-kube-api-access-w9jk6\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.503575 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.503542 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-trusted-ca-bundle\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.503575 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.503567 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-oauth-serving-cert\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.503842 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.503592 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-serving-cert\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.504189 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.504140 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-config\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.504277 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.504140 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-service-ca\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.504318 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.504308 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-oauth-serving-cert\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.504653 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.504629 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-trusted-ca-bundle\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.506029 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.506005 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-oauth-config\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.506178 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.506133 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-serving-cert\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.512266 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.512245 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9jk6\" (UniqueName: \"kubernetes.io/projected/2e17c4e0-04ae-49d0-8c52-562cf1661296-kube-api-access-w9jk6\") pod \"console-89ccddb4-ql8kd\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.645820 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.645726 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:31.769665 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:31.769640 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-89ccddb4-ql8kd"] Apr 16 18:33:31.771647 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:33:31.771617 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e17c4e0_04ae_49d0_8c52_562cf1661296.slice/crio-8f19d1f4a66506536aaa4897198536b5cea900982b72a82dd1f55bc897cdb08e WatchSource:0}: Error finding container 8f19d1f4a66506536aaa4897198536b5cea900982b72a82dd1f55bc897cdb08e: Status 404 returned error can't find the container with id 8f19d1f4a66506536aaa4897198536b5cea900982b72a82dd1f55bc897cdb08e Apr 16 18:33:32.133667 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:32.133626 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89ccddb4-ql8kd" event={"ID":"2e17c4e0-04ae-49d0-8c52-562cf1661296","Type":"ContainerStarted","Data":"33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000"} Apr 16 18:33:32.133667 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:32.133670 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89ccddb4-ql8kd" event={"ID":"2e17c4e0-04ae-49d0-8c52-562cf1661296","Type":"ContainerStarted","Data":"8f19d1f4a66506536aaa4897198536b5cea900982b72a82dd1f55bc897cdb08e"} Apr 16 18:33:32.155137 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:32.155090 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-89ccddb4-ql8kd" podStartSLOduration=1.155075606 podStartE2EDuration="1.155075606s" podCreationTimestamp="2026-04-16 18:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:32.153133503 +0000 UTC m=+163.242397097" 
watchObservedRunningTime="2026-04-16 18:33:32.155075606 +0000 UTC m=+163.244339202" Apr 16 18:33:41.646288 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:41.646252 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:41.646288 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:41.646292 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:41.650924 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:41.650902 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:42.167463 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:42.167428 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:33:42.219993 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:42.219963 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56cb865b56-2jjbw"] Apr 16 18:33:56.225305 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.225241 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-748f7895cf-kxtl9" podUID="62209e8e-6f79-4034-ad78-880129363760" containerName="console" containerID="cri-o://da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8" gracePeriod=15 Apr 16 18:33:56.465507 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.465480 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-748f7895cf-kxtl9_62209e8e-6f79-4034-ad78-880129363760/console/0.log" Apr 16 18:33:56.465658 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.465559 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:33:56.526298 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526201 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-oauth-serving-cert\") pod \"62209e8e-6f79-4034-ad78-880129363760\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " Apr 16 18:33:56.526298 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526254 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-serving-cert\") pod \"62209e8e-6f79-4034-ad78-880129363760\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " Apr 16 18:33:56.526298 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526294 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-oauth-config\") pod \"62209e8e-6f79-4034-ad78-880129363760\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " Apr 16 18:33:56.526566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526324 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-trusted-ca-bundle\") pod \"62209e8e-6f79-4034-ad78-880129363760\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " Apr 16 18:33:56.526566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526352 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thv8l\" (UniqueName: \"kubernetes.io/projected/62209e8e-6f79-4034-ad78-880129363760-kube-api-access-thv8l\") pod \"62209e8e-6f79-4034-ad78-880129363760\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " Apr 16 
18:33:56.526566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526397 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-service-ca\") pod \"62209e8e-6f79-4034-ad78-880129363760\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " Apr 16 18:33:56.526566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526419 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-console-config\") pod \"62209e8e-6f79-4034-ad78-880129363760\" (UID: \"62209e8e-6f79-4034-ad78-880129363760\") " Apr 16 18:33:56.526763 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526687 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "62209e8e-6f79-4034-ad78-880129363760" (UID: "62209e8e-6f79-4034-ad78-880129363760"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:33:56.526910 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526827 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "62209e8e-6f79-4034-ad78-880129363760" (UID: "62209e8e-6f79-4034-ad78-880129363760"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:33:56.527019 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526921 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-service-ca" (OuterVolumeSpecName: "service-ca") pod "62209e8e-6f79-4034-ad78-880129363760" (UID: "62209e8e-6f79-4034-ad78-880129363760"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:33:56.527019 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.526982 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-console-config" (OuterVolumeSpecName: "console-config") pod "62209e8e-6f79-4034-ad78-880129363760" (UID: "62209e8e-6f79-4034-ad78-880129363760"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:33:56.528651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.528620 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62209e8e-6f79-4034-ad78-880129363760-kube-api-access-thv8l" (OuterVolumeSpecName: "kube-api-access-thv8l") pod "62209e8e-6f79-4034-ad78-880129363760" (UID: "62209e8e-6f79-4034-ad78-880129363760"). InnerVolumeSpecName "kube-api-access-thv8l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:33:56.528756 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.528658 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "62209e8e-6f79-4034-ad78-880129363760" (UID: "62209e8e-6f79-4034-ad78-880129363760"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:33:56.528756 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.528686 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "62209e8e-6f79-4034-ad78-880129363760" (UID: "62209e8e-6f79-4034-ad78-880129363760"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:33:56.627469 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.627427 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:33:56.627469 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.627459 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62209e8e-6f79-4034-ad78-880129363760-console-oauth-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:33:56.627469 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.627469 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-trusted-ca-bundle\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:33:56.627711 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.627483 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thv8l\" (UniqueName: \"kubernetes.io/projected/62209e8e-6f79-4034-ad78-880129363760-kube-api-access-thv8l\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:33:56.627711 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.627497 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-service-ca\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:33:56.627711 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.627511 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-console-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:33:56.627711 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:56.627523 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62209e8e-6f79-4034-ad78-880129363760-oauth-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:33:57.208086 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.208057 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-748f7895cf-kxtl9_62209e8e-6f79-4034-ad78-880129363760/console/0.log" Apr 16 18:33:57.208276 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.208101 2580 generic.go:358] "Generic (PLEG): container finished" podID="62209e8e-6f79-4034-ad78-880129363760" containerID="da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8" exitCode=2 Apr 16 18:33:57.208276 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.208196 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748f7895cf-kxtl9" event={"ID":"62209e8e-6f79-4034-ad78-880129363760","Type":"ContainerDied","Data":"da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8"} Apr 16 18:33:57.208276 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.208225 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748f7895cf-kxtl9" Apr 16 18:33:57.208276 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.208245 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748f7895cf-kxtl9" event={"ID":"62209e8e-6f79-4034-ad78-880129363760","Type":"ContainerDied","Data":"dc1f2474a90cd4e2bf31a60c6dfe8dcf1f8a49c85f72685c3b7323d220c75ad1"} Apr 16 18:33:57.208276 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.208261 2580 scope.go:117] "RemoveContainer" containerID="da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8" Apr 16 18:33:57.216638 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.216621 2580 scope.go:117] "RemoveContainer" containerID="da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8" Apr 16 18:33:57.216928 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:33:57.216908 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8\": container with ID starting with da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8 not found: ID does not exist" containerID="da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8" Apr 16 18:33:57.216981 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.216937 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8"} err="failed to get container status \"da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8\": rpc error: code = NotFound desc = could not find container \"da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8\": container with ID starting with da3813d85a1bc6452c3f0d191f5c57ab011d1e44b43c51f99ff6bc2d26307ff8 not found: ID does not exist" Apr 16 18:33:57.235176 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.235124 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-748f7895cf-kxtl9"] Apr 16 18:33:57.243703 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.243675 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-748f7895cf-kxtl9"] Apr 16 18:33:57.538903 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:33:57.538868 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62209e8e-6f79-4034-ad78-880129363760" path="/var/lib/kubelet/pods/62209e8e-6f79-4034-ad78-880129363760/volumes" Apr 16 18:34:07.240202 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.240130 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56cb865b56-2jjbw" podUID="560cc19f-7a6f-44e1-9dc5-339e345023b6" containerName="console" containerID="cri-o://ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19" gracePeriod=15 Apr 16 18:34:07.487348 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.487313 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56cb865b56-2jjbw_560cc19f-7a6f-44e1-9dc5-339e345023b6/console/0.log" Apr 16 18:34:07.487489 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.487399 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:34:07.520299 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520219 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-trusted-ca-bundle\") pod \"560cc19f-7a6f-44e1-9dc5-339e345023b6\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " Apr 16 18:34:07.520299 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520254 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-config\") pod \"560cc19f-7a6f-44e1-9dc5-339e345023b6\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " Apr 16 18:34:07.520510 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520305 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-service-ca\") pod \"560cc19f-7a6f-44e1-9dc5-339e345023b6\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " Apr 16 18:34:07.520510 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520414 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-serving-cert\") pod \"560cc19f-7a6f-44e1-9dc5-339e345023b6\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " Apr 16 18:34:07.520510 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520481 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-oauth-serving-cert\") pod \"560cc19f-7a6f-44e1-9dc5-339e345023b6\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " Apr 16 18:34:07.520667 ip-10-0-135-146 
kubenswrapper[2580]: I0416 18:34:07.520538 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-oauth-config\") pod \"560cc19f-7a6f-44e1-9dc5-339e345023b6\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " Apr 16 18:34:07.520667 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520651 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-service-ca" (OuterVolumeSpecName: "service-ca") pod "560cc19f-7a6f-44e1-9dc5-339e345023b6" (UID: "560cc19f-7a6f-44e1-9dc5-339e345023b6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:07.520767 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520700 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "560cc19f-7a6f-44e1-9dc5-339e345023b6" (UID: "560cc19f-7a6f-44e1-9dc5-339e345023b6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:07.520827 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520757 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-config" (OuterVolumeSpecName: "console-config") pod "560cc19f-7a6f-44e1-9dc5-339e345023b6" (UID: "560cc19f-7a6f-44e1-9dc5-339e345023b6"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:07.520827 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520800 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5552s\" (UniqueName: \"kubernetes.io/projected/560cc19f-7a6f-44e1-9dc5-339e345023b6-kube-api-access-5552s\") pod \"560cc19f-7a6f-44e1-9dc5-339e345023b6\" (UID: \"560cc19f-7a6f-44e1-9dc5-339e345023b6\") " Apr 16 18:34:07.520951 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.520927 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "560cc19f-7a6f-44e1-9dc5-339e345023b6" (UID: "560cc19f-7a6f-44e1-9dc5-339e345023b6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:07.521102 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.521075 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-service-ca\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:34:07.521102 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.521094 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-oauth-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:34:07.521255 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.521107 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-trusted-ca-bundle\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:34:07.521255 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.521117 2580 reconciler_common.go:299] "Volume detached for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:34:07.522719 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.522686 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "560cc19f-7a6f-44e1-9dc5-339e345023b6" (UID: "560cc19f-7a6f-44e1-9dc5-339e345023b6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:07.522808 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.522745 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/560cc19f-7a6f-44e1-9dc5-339e345023b6-kube-api-access-5552s" (OuterVolumeSpecName: "kube-api-access-5552s") pod "560cc19f-7a6f-44e1-9dc5-339e345023b6" (UID: "560cc19f-7a6f-44e1-9dc5-339e345023b6"). InnerVolumeSpecName "kube-api-access-5552s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:07.522977 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.522959 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "560cc19f-7a6f-44e1-9dc5-339e345023b6" (UID: "560cc19f-7a6f-44e1-9dc5-339e345023b6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:07.622357 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.622319 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:34:07.622357 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.622353 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560cc19f-7a6f-44e1-9dc5-339e345023b6-console-oauth-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:34:07.622357 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:07.622363 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5552s\" (UniqueName: \"kubernetes.io/projected/560cc19f-7a6f-44e1-9dc5-339e345023b6-kube-api-access-5552s\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:34:08.248377 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.248347 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56cb865b56-2jjbw_560cc19f-7a6f-44e1-9dc5-339e345023b6/console/0.log" Apr 16 18:34:08.248774 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.248388 2580 generic.go:358] "Generic (PLEG): container finished" podID="560cc19f-7a6f-44e1-9dc5-339e345023b6" containerID="ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19" exitCode=2 Apr 16 18:34:08.248774 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.248419 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cb865b56-2jjbw" event={"ID":"560cc19f-7a6f-44e1-9dc5-339e345023b6","Type":"ContainerDied","Data":"ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19"} Apr 16 18:34:08.248774 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.248456 2580 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-console/console-56cb865b56-2jjbw" event={"ID":"560cc19f-7a6f-44e1-9dc5-339e345023b6","Type":"ContainerDied","Data":"17ed91b6c58c8a006929a31fa19309c622adb59dd36e414be0bb36eb740e955f"} Apr 16 18:34:08.248774 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.248475 2580 scope.go:117] "RemoveContainer" containerID="ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19" Apr 16 18:34:08.248774 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.248472 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56cb865b56-2jjbw" Apr 16 18:34:08.256251 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.256233 2580 scope.go:117] "RemoveContainer" containerID="ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19" Apr 16 18:34:08.256490 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:34:08.256470 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19\": container with ID starting with ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19 not found: ID does not exist" containerID="ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19" Apr 16 18:34:08.256526 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.256501 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19"} err="failed to get container status \"ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19\": rpc error: code = NotFound desc = could not find container \"ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19\": container with ID starting with ee742745f5edd190c2d98de4526ad4c4a204594b249ec6c09401ae75dab8ba19 not found: ID does not exist" Apr 16 18:34:08.267001 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.266977 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56cb865b56-2jjbw"] Apr 16 18:34:08.270419 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:08.270398 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56cb865b56-2jjbw"] Apr 16 18:34:09.538954 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:34:09.538919 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="560cc19f-7a6f-44e1-9dc5-339e345023b6" path="/var/lib/kubelet/pods/560cc19f-7a6f-44e1-9dc5-339e345023b6/volumes" Apr 16 18:35:49.414461 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:35:49.414435 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:36:08.079818 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.079783 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hlhl2"] Apr 16 18:36:08.081910 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.080112 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="560cc19f-7a6f-44e1-9dc5-339e345023b6" containerName="console" Apr 16 18:36:08.081910 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.080123 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="560cc19f-7a6f-44e1-9dc5-339e345023b6" containerName="console" Apr 16 18:36:08.081910 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.080136 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62209e8e-6f79-4034-ad78-880129363760" containerName="console" Apr 16 18:36:08.081910 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.080159 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="62209e8e-6f79-4034-ad78-880129363760" containerName="console" Apr 16 18:36:08.081910 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.080222 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="62209e8e-6f79-4034-ad78-880129363760" containerName="console" Apr 16 18:36:08.081910 
ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.080232 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="560cc19f-7a6f-44e1-9dc5-339e345023b6" containerName="console" Apr 16 18:36:08.083077 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.083062 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" Apr 16 18:36:08.085887 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.085863 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-zjsn7\"" Apr 16 18:36:08.085986 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.085876 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 18:36:08.086819 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.086805 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 18:36:08.093914 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.093895 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hlhl2"] Apr 16 18:36:08.096751 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.096735 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nh92\" (UniqueName: \"kubernetes.io/projected/3c02520e-20ee-4bcd-b138-6e635c8029e3-kube-api-access-2nh92\") pod \"cert-manager-cainjector-8966b78d4-hlhl2\" (UID: \"3c02520e-20ee-4bcd-b138-6e635c8029e3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" Apr 16 18:36:08.096823 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.096766 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c02520e-20ee-4bcd-b138-6e635c8029e3-bound-sa-token\") 
pod \"cert-manager-cainjector-8966b78d4-hlhl2\" (UID: \"3c02520e-20ee-4bcd-b138-6e635c8029e3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" Apr 16 18:36:08.197948 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.197911 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nh92\" (UniqueName: \"kubernetes.io/projected/3c02520e-20ee-4bcd-b138-6e635c8029e3-kube-api-access-2nh92\") pod \"cert-manager-cainjector-8966b78d4-hlhl2\" (UID: \"3c02520e-20ee-4bcd-b138-6e635c8029e3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" Apr 16 18:36:08.198123 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.197967 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c02520e-20ee-4bcd-b138-6e635c8029e3-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-hlhl2\" (UID: \"3c02520e-20ee-4bcd-b138-6e635c8029e3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" Apr 16 18:36:08.207114 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.207083 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c02520e-20ee-4bcd-b138-6e635c8029e3-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-hlhl2\" (UID: \"3c02520e-20ee-4bcd-b138-6e635c8029e3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" Apr 16 18:36:08.207273 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.207255 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nh92\" (UniqueName: \"kubernetes.io/projected/3c02520e-20ee-4bcd-b138-6e635c8029e3-kube-api-access-2nh92\") pod \"cert-manager-cainjector-8966b78d4-hlhl2\" (UID: \"3c02520e-20ee-4bcd-b138-6e635c8029e3\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" Apr 16 18:36:08.403466 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.403365 2580 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" Apr 16 18:36:08.521636 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.521550 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hlhl2"] Apr 16 18:36:08.524348 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:36:08.524318 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c02520e_20ee_4bcd_b138_6e635c8029e3.slice/crio-1b3f01d68e0b5bf3bf1a0e48aeb0dfb8f61839fc38b65c91e0f2ed7ebf76c37b WatchSource:0}: Error finding container 1b3f01d68e0b5bf3bf1a0e48aeb0dfb8f61839fc38b65c91e0f2ed7ebf76c37b: Status 404 returned error can't find the container with id 1b3f01d68e0b5bf3bf1a0e48aeb0dfb8f61839fc38b65c91e0f2ed7ebf76c37b Apr 16 18:36:08.526203 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.526184 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:36:08.605792 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:08.605766 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" event={"ID":"3c02520e-20ee-4bcd-b138-6e635c8029e3","Type":"ContainerStarted","Data":"1b3f01d68e0b5bf3bf1a0e48aeb0dfb8f61839fc38b65c91e0f2ed7ebf76c37b"} Apr 16 18:36:11.617327 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:11.617294 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" event={"ID":"3c02520e-20ee-4bcd-b138-6e635c8029e3","Type":"ContainerStarted","Data":"f1e33d33196dee8572edab6827a794969a1717671f9c405b713ad9531b944e4b"} Apr 16 18:36:11.641045 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:11.641000 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-hlhl2" podStartSLOduration=0.710644937 
podStartE2EDuration="3.64098675s" podCreationTimestamp="2026-04-16 18:36:08 +0000 UTC" firstStartedPulling="2026-04-16 18:36:08.526387247 +0000 UTC m=+319.615650832" lastFinishedPulling="2026-04-16 18:36:11.456729074 +0000 UTC m=+322.545992645" observedRunningTime="2026-04-16 18:36:11.639415354 +0000 UTC m=+322.728678948" watchObservedRunningTime="2026-04-16 18:36:11.64098675 +0000 UTC m=+322.730250343" Apr 16 18:36:53.270540 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.270507 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7"] Apr 16 18:36:53.273881 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.273860 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.277332 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.277310 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-277fx\"" Apr 16 18:36:53.277510 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.277283 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 18:36:53.277639 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.277510 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 18:36:53.277774 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.277749 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 18:36:53.278176 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.278134 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:36:53.278279 ip-10-0-135-146 kubenswrapper[2580]: I0416 
18:36:53.278210 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 18:36:53.285918 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.285896 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7"] Apr 16 18:36:53.381624 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.381587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-manager-config\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.381786 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.381659 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmwl\" (UniqueName: \"kubernetes.io/projected/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-kube-api-access-xwmwl\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.381786 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.381695 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-metrics-cert\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.381786 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.381714 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-cert\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.482946 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.482918 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-manager-config\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.483076 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.482977 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmwl\" (UniqueName: \"kubernetes.io/projected/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-kube-api-access-xwmwl\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.483076 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.483009 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-metrics-cert\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.483076 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.483036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-cert\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 
18:36:53.483701 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.483661 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-manager-config\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.485443 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.485422 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-metrics-cert\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.485534 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.485447 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-cert\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.497230 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.497208 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwmwl\" (UniqueName: \"kubernetes.io/projected/0e2be300-3e78-4416-8f7f-ac42cbb89d9e-kube-api-access-xwmwl\") pod \"lws-controller-manager-7fd84c546d-q2vh7\" (UID: \"0e2be300-3e78-4416-8f7f-ac42cbb89d9e\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.583132 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.583066 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:53.709765 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.709737 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7"] Apr 16 18:36:53.711633 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:36:53.711603 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e2be300_3e78_4416_8f7f_ac42cbb89d9e.slice/crio-77d28c0bf6c0110ef84ad99f6f8d9e75cdddaa14b90418dc0bddad52e12cadd8 WatchSource:0}: Error finding container 77d28c0bf6c0110ef84ad99f6f8d9e75cdddaa14b90418dc0bddad52e12cadd8: Status 404 returned error can't find the container with id 77d28c0bf6c0110ef84ad99f6f8d9e75cdddaa14b90418dc0bddad52e12cadd8 Apr 16 18:36:53.741831 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:53.741804 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" event={"ID":"0e2be300-3e78-4416-8f7f-ac42cbb89d9e","Type":"ContainerStarted","Data":"77d28c0bf6c0110ef84ad99f6f8d9e75cdddaa14b90418dc0bddad52e12cadd8"} Apr 16 18:36:56.752892 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:56.752851 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" event={"ID":"0e2be300-3e78-4416-8f7f-ac42cbb89d9e","Type":"ContainerStarted","Data":"791dcce053a9f6907df7ffbdb9906eb95ce7f30cc9e292438aeed61eb10b579a"} Apr 16 18:36:56.753303 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:56.752951 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:36:56.785652 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:36:56.785604 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" podStartSLOduration=1.322034039 podStartE2EDuration="3.785591732s" podCreationTimestamp="2026-04-16 18:36:53 +0000 UTC" firstStartedPulling="2026-04-16 18:36:53.713446065 +0000 UTC m=+364.802709636" lastFinishedPulling="2026-04-16 18:36:56.177003759 +0000 UTC m=+367.266267329" observedRunningTime="2026-04-16 18:36:56.783476088 +0000 UTC m=+367.872739681" watchObservedRunningTime="2026-04-16 18:36:56.785591732 +0000 UTC m=+367.874855324" Apr 16 18:37:07.758377 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:07.758342 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-q2vh7" Apr 16 18:37:34.401133 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.401097 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"] Apr 16 18:37:34.403693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.403677 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6" Apr 16 18:37:34.408668 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.408644 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 18:37:34.409050 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.409030 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-jrpl9\"" Apr 16 18:37:34.409341 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.409327 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 18:37:34.419590 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.419570 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"] Apr 16 18:37:34.543808 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.543777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzm9z\" (UniqueName: \"kubernetes.io/projected/321dfd1b-d3ed-4840-a323-fab5e59a3836-kube-api-access-dzm9z\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-84zc6\" (UID: \"321dfd1b-d3ed-4840-a323-fab5e59a3836\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6" Apr 16 18:37:34.543977 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.543898 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/321dfd1b-d3ed-4840-a323-fab5e59a3836-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-84zc6\" (UID: \"321dfd1b-d3ed-4840-a323-fab5e59a3836\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6" Apr 16 18:37:34.645274 
ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.645236 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/321dfd1b-d3ed-4840-a323-fab5e59a3836-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-84zc6\" (UID: \"321dfd1b-d3ed-4840-a323-fab5e59a3836\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"
Apr 16 18:37:34.645490 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.645296 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzm9z\" (UniqueName: \"kubernetes.io/projected/321dfd1b-d3ed-4840-a323-fab5e59a3836-kube-api-access-dzm9z\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-84zc6\" (UID: \"321dfd1b-d3ed-4840-a323-fab5e59a3836\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"
Apr 16 18:37:34.645610 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.645590 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/321dfd1b-d3ed-4840-a323-fab5e59a3836-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-84zc6\" (UID: \"321dfd1b-d3ed-4840-a323-fab5e59a3836\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"
Apr 16 18:37:34.659382 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.659304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzm9z\" (UniqueName: \"kubernetes.io/projected/321dfd1b-d3ed-4840-a323-fab5e59a3836-kube-api-access-dzm9z\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-84zc6\" (UID: \"321dfd1b-d3ed-4840-a323-fab5e59a3836\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"
Apr 16 18:37:34.713544 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.713515 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"
Apr 16 18:37:34.848206 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.848183 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"]
Apr 16 18:37:34.851189 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:37:34.851130 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod321dfd1b_d3ed_4840_a323_fab5e59a3836.slice/crio-aec1f7e9393ac1eafd43da1a4fd2adf7fe0da92dcab5fd9dc79cb4e2edb50318 WatchSource:0}: Error finding container aec1f7e9393ac1eafd43da1a4fd2adf7fe0da92dcab5fd9dc79cb4e2edb50318: Status 404 returned error can't find the container with id aec1f7e9393ac1eafd43da1a4fd2adf7fe0da92dcab5fd9dc79cb4e2edb50318
Apr 16 18:37:34.873789 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:34.873765 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6" event={"ID":"321dfd1b-d3ed-4840-a323-fab5e59a3836","Type":"ContainerStarted","Data":"aec1f7e9393ac1eafd43da1a4fd2adf7fe0da92dcab5fd9dc79cb4e2edb50318"}
Apr 16 18:37:38.894953 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:38.894868 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6" event={"ID":"321dfd1b-d3ed-4840-a323-fab5e59a3836","Type":"ContainerStarted","Data":"07705c37bcfea5b26fd45e38b7c383fae935b1526dcaed03ac82650e9c76ed7e"}
Apr 16 18:37:38.895381 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:38.895097 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"
Apr 16 18:37:38.917214 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:38.917159 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6" podStartSLOduration=1.183345073 podStartE2EDuration="4.917124329s" podCreationTimestamp="2026-04-16 18:37:34 +0000 UTC" firstStartedPulling="2026-04-16 18:37:34.853712639 +0000 UTC m=+405.942976215" lastFinishedPulling="2026-04-16 18:37:38.587491895 +0000 UTC m=+409.676755471" observedRunningTime="2026-04-16 18:37:38.914219971 +0000 UTC m=+410.003483564" watchObservedRunningTime="2026-04-16 18:37:38.917124329 +0000 UTC m=+410.006387924"
Apr 16 18:37:49.901210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:37:49.901174 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-84zc6"
Apr 16 18:38:23.842559 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:23.842478 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"]
Apr 16 18:38:23.849468 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:23.849442 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:23.852520 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:23.852498 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6sw6v\""
Apr 16 18:38:23.852843 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:23.852818 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 18:38:23.856030 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:23.856009 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"]
Apr 16 18:38:23.944935 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:23.944902 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"]
Apr 16 18:38:23.978115 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:23.978076 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j78f4\" (UniqueName: \"kubernetes.io/projected/589e6513-8147-4d34-b036-8abe488d0868-kube-api-access-j78f4\") pod \"limitador-limitador-64c8f475fb-wrkg9\" (UID: \"589e6513-8147-4d34-b036-8abe488d0868\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:23.978285 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:23.978123 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/589e6513-8147-4d34-b036-8abe488d0868-config-file\") pod \"limitador-limitador-64c8f475fb-wrkg9\" (UID: \"589e6513-8147-4d34-b036-8abe488d0868\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:24.079055 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.079023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j78f4\" (UniqueName: \"kubernetes.io/projected/589e6513-8147-4d34-b036-8abe488d0868-kube-api-access-j78f4\") pod \"limitador-limitador-64c8f475fb-wrkg9\" (UID: \"589e6513-8147-4d34-b036-8abe488d0868\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:24.079055 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.079060 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/589e6513-8147-4d34-b036-8abe488d0868-config-file\") pod \"limitador-limitador-64c8f475fb-wrkg9\" (UID: \"589e6513-8147-4d34-b036-8abe488d0868\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:24.079609 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.079592 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/589e6513-8147-4d34-b036-8abe488d0868-config-file\") pod \"limitador-limitador-64c8f475fb-wrkg9\" (UID: \"589e6513-8147-4d34-b036-8abe488d0868\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:24.088077 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.088051 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j78f4\" (UniqueName: \"kubernetes.io/projected/589e6513-8147-4d34-b036-8abe488d0868-kube-api-access-j78f4\") pod \"limitador-limitador-64c8f475fb-wrkg9\" (UID: \"589e6513-8147-4d34-b036-8abe488d0868\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:24.161321 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.161227 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:24.287907 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.287880 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"]
Apr 16 18:38:24.289749 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:38:24.289715 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod589e6513_8147_4d34_b036_8abe488d0868.slice/crio-cffbfe1a432b6d4f40a660edee60c2990fa8ebe28a411996dbc08846f0834642 WatchSource:0}: Error finding container cffbfe1a432b6d4f40a660edee60c2990fa8ebe28a411996dbc08846f0834642: Status 404 returned error can't find the container with id cffbfe1a432b6d4f40a660edee60c2990fa8ebe28a411996dbc08846f0834642
Apr 16 18:38:24.493323 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.493288 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6844f85596-v6nmk"]
Apr 16 18:38:24.498054 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.498030 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.507829 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.507802 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6844f85596-v6nmk"]
Apr 16 18:38:24.583779 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.583743 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-service-ca\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.583779 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.583777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-config\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.583986 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.583809 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-oauth-config\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.583986 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.583868 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q299\" (UniqueName: \"kubernetes.io/projected/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-kube-api-access-8q299\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.583986 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.583905 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-serving-cert\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.583986 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.583932 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-trusted-ca-bundle\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.584108 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.583982 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-oauth-serving-cert\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.685339 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.685305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-service-ca\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.685339 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.685345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-config\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.685557 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.685376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-oauth-config\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.685557 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.685420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q299\" (UniqueName: \"kubernetes.io/projected/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-kube-api-access-8q299\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.685557 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.685462 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-serving-cert\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.685695 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.685659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-trusted-ca-bundle\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.685750 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.685727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-oauth-serving-cert\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.686175 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.686129 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-config\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.686298 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.686183 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-service-ca\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.686356 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.686309 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-oauth-serving-cert\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.686482 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.686461 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-trusted-ca-bundle\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.688099 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.688073 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-oauth-config\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.688221 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.688204 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-console-serving-cert\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.694275 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.694250 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q299\" (UniqueName: \"kubernetes.io/projected/3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd-kube-api-access-8q299\") pod \"console-6844f85596-v6nmk\" (UID: \"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd\") " pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.713197 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.713171 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-8hfzt"]
Apr 16 18:38:24.717771 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.717751 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-8hfzt"
Apr 16 18:38:24.721603 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.721582 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-7zdnf\""
Apr 16 18:38:24.725491 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.725467 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-8hfzt"]
Apr 16 18:38:24.808372 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.808280 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:24.886750 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.886721 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkmn4\" (UniqueName: \"kubernetes.io/projected/0318fd11-5052-44f5-9c8e-b051b67ef2d8-kube-api-access-kkmn4\") pod \"authorino-674b59b84c-8hfzt\" (UID: \"0318fd11-5052-44f5-9c8e-b051b67ef2d8\") " pod="kuadrant-system/authorino-674b59b84c-8hfzt"
Apr 16 18:38:24.930933 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.930904 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dssmk"]
Apr 16 18:38:24.935733 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.935714 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6844f85596-v6nmk"]
Apr 16 18:38:24.935872 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.935820 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-dssmk"
Apr 16 18:38:24.938017 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:38:24.937988 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bcd6742_aa1c_4ae5_baa3_c5856b2c17cd.slice/crio-deef9877a7b94e349998af47afd9f1c2e43b87647f2de63dc259b4ef5c70d625 WatchSource:0}: Error finding container deef9877a7b94e349998af47afd9f1c2e43b87647f2de63dc259b4ef5c70d625: Status 404 returned error can't find the container with id deef9877a7b94e349998af47afd9f1c2e43b87647f2de63dc259b4ef5c70d625
Apr 16 18:38:24.939727 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.939707 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dssmk"]
Apr 16 18:38:24.987339 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.987314 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkmn4\" (UniqueName: \"kubernetes.io/projected/0318fd11-5052-44f5-9c8e-b051b67ef2d8-kube-api-access-kkmn4\") pod \"authorino-674b59b84c-8hfzt\" (UID: \"0318fd11-5052-44f5-9c8e-b051b67ef2d8\") " pod="kuadrant-system/authorino-674b59b84c-8hfzt"
Apr 16 18:38:24.997506 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:24.997486 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkmn4\" (UniqueName: \"kubernetes.io/projected/0318fd11-5052-44f5-9c8e-b051b67ef2d8-kube-api-access-kkmn4\") pod \"authorino-674b59b84c-8hfzt\" (UID: \"0318fd11-5052-44f5-9c8e-b051b67ef2d8\") " pod="kuadrant-system/authorino-674b59b84c-8hfzt"
Apr 16 18:38:25.029422 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.029390 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-8hfzt"
Apr 16 18:38:25.045425 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.045387 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9" event={"ID":"589e6513-8147-4d34-b036-8abe488d0868","Type":"ContainerStarted","Data":"cffbfe1a432b6d4f40a660edee60c2990fa8ebe28a411996dbc08846f0834642"}
Apr 16 18:38:25.047871 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.047843 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6844f85596-v6nmk" event={"ID":"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd","Type":"ContainerStarted","Data":"f60a8c5ecc848fee614461d513183402b97d9bd74b8954193df77f078a567310"}
Apr 16 18:38:25.047988 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.047877 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6844f85596-v6nmk" event={"ID":"3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd","Type":"ContainerStarted","Data":"deef9877a7b94e349998af47afd9f1c2e43b87647f2de63dc259b4ef5c70d625"}
Apr 16 18:38:25.075686 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.075543 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6844f85596-v6nmk" podStartSLOduration=1.075527897 podStartE2EDuration="1.075527897s" podCreationTimestamp="2026-04-16 18:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:38:25.074293144 +0000 UTC m=+456.163556738" watchObservedRunningTime="2026-04-16 18:38:25.075527897 +0000 UTC m=+456.164791490"
Apr 16 18:38:25.088775 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.088731 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbpwn\" (UniqueName: \"kubernetes.io/projected/394e0633-bca8-4b16-ab53-bbe00b6f505b-kube-api-access-mbpwn\") pod \"authorino-79cbc94b89-dssmk\" (UID: \"394e0633-bca8-4b16-ab53-bbe00b6f505b\") " pod="kuadrant-system/authorino-79cbc94b89-dssmk"
Apr 16 18:38:25.156263 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.156238 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-8hfzt"]
Apr 16 18:38:25.157487 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:38:25.157457 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0318fd11_5052_44f5_9c8e_b051b67ef2d8.slice/crio-7a6ecc6319a3515cf7b364734eb7ebfa62dce74cda21ad7c20ea773479ae633d WatchSource:0}: Error finding container 7a6ecc6319a3515cf7b364734eb7ebfa62dce74cda21ad7c20ea773479ae633d: Status 404 returned error can't find the container with id 7a6ecc6319a3515cf7b364734eb7ebfa62dce74cda21ad7c20ea773479ae633d
Apr 16 18:38:25.189596 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.189530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbpwn\" (UniqueName: \"kubernetes.io/projected/394e0633-bca8-4b16-ab53-bbe00b6f505b-kube-api-access-mbpwn\") pod \"authorino-79cbc94b89-dssmk\" (UID: \"394e0633-bca8-4b16-ab53-bbe00b6f505b\") " pod="kuadrant-system/authorino-79cbc94b89-dssmk"
Apr 16 18:38:25.199994 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.199960 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbpwn\" (UniqueName: \"kubernetes.io/projected/394e0633-bca8-4b16-ab53-bbe00b6f505b-kube-api-access-mbpwn\") pod \"authorino-79cbc94b89-dssmk\" (UID: \"394e0633-bca8-4b16-ab53-bbe00b6f505b\") " pod="kuadrant-system/authorino-79cbc94b89-dssmk"
Apr 16 18:38:25.245369 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.245332 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-dssmk"
Apr 16 18:38:25.407934 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:25.407900 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dssmk"]
Apr 16 18:38:25.410511 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:38:25.410476 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394e0633_bca8_4b16_ab53_bbe00b6f505b.slice/crio-a3be1bae55aaabb6f2502ffc1ee9c182a1ba172a092346f3d6fd25463e195332 WatchSource:0}: Error finding container a3be1bae55aaabb6f2502ffc1ee9c182a1ba172a092346f3d6fd25463e195332: Status 404 returned error can't find the container with id a3be1bae55aaabb6f2502ffc1ee9c182a1ba172a092346f3d6fd25463e195332
Apr 16 18:38:26.054564 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:26.054507 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-dssmk" event={"ID":"394e0633-bca8-4b16-ab53-bbe00b6f505b","Type":"ContainerStarted","Data":"a3be1bae55aaabb6f2502ffc1ee9c182a1ba172a092346f3d6fd25463e195332"}
Apr 16 18:38:26.056106 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:26.056043 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-8hfzt" event={"ID":"0318fd11-5052-44f5-9c8e-b051b67ef2d8","Type":"ContainerStarted","Data":"7a6ecc6319a3515cf7b364734eb7ebfa62dce74cda21ad7c20ea773479ae633d"}
Apr 16 18:38:29.068608 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:29.068571 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9" event={"ID":"589e6513-8147-4d34-b036-8abe488d0868","Type":"ContainerStarted","Data":"1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da"}
Apr 16 18:38:29.069063 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:29.068621 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:29.069889 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:29.069857 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-dssmk" event={"ID":"394e0633-bca8-4b16-ab53-bbe00b6f505b","Type":"ContainerStarted","Data":"3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035"}
Apr 16 18:38:29.071006 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:29.070984 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-8hfzt" event={"ID":"0318fd11-5052-44f5-9c8e-b051b67ef2d8","Type":"ContainerStarted","Data":"ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086"}
Apr 16 18:38:29.088210 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:29.088129 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9" podStartSLOduration=1.773738612 podStartE2EDuration="6.08811286s" podCreationTimestamp="2026-04-16 18:38:23 +0000 UTC" firstStartedPulling="2026-04-16 18:38:24.291989318 +0000 UTC m=+455.381252890" lastFinishedPulling="2026-04-16 18:38:28.606363568 +0000 UTC m=+459.695627138" observedRunningTime="2026-04-16 18:38:29.085765603 +0000 UTC m=+460.175029196" watchObservedRunningTime="2026-04-16 18:38:29.08811286 +0000 UTC m=+460.177376453"
Apr 16 18:38:29.102340 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:29.102288 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-8hfzt" podStartSLOduration=1.7228118110000001 podStartE2EDuration="5.102272544s" podCreationTimestamp="2026-04-16 18:38:24 +0000 UTC" firstStartedPulling="2026-04-16 18:38:25.15902689 +0000 UTC m=+456.248290461" lastFinishedPulling="2026-04-16 18:38:28.538487623 +0000 UTC m=+459.627751194" observedRunningTime="2026-04-16 18:38:29.101377908 +0000 UTC m=+460.190641503" watchObservedRunningTime="2026-04-16 18:38:29.102272544 +0000 UTC m=+460.191536137"
Apr 16 18:38:29.120766 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:29.120702 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-dssmk" podStartSLOduration=1.925949505 podStartE2EDuration="5.12068249s" podCreationTimestamp="2026-04-16 18:38:24 +0000 UTC" firstStartedPulling="2026-04-16 18:38:25.412342806 +0000 UTC m=+456.501606390" lastFinishedPulling="2026-04-16 18:38:28.607075791 +0000 UTC m=+459.696339375" observedRunningTime="2026-04-16 18:38:29.119733287 +0000 UTC m=+460.208996882" watchObservedRunningTime="2026-04-16 18:38:29.12068249 +0000 UTC m=+460.209946084"
Apr 16 18:38:29.147266 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:29.147233 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-8hfzt"]
Apr 16 18:38:31.077847 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:31.077802 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-8hfzt" podUID="0318fd11-5052-44f5-9c8e-b051b67ef2d8" containerName="authorino" containerID="cri-o://ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086" gracePeriod=30
Apr 16 18:38:31.319040 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:31.319018 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-8hfzt"
Apr 16 18:38:31.449482 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:31.449401 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkmn4\" (UniqueName: \"kubernetes.io/projected/0318fd11-5052-44f5-9c8e-b051b67ef2d8-kube-api-access-kkmn4\") pod \"0318fd11-5052-44f5-9c8e-b051b67ef2d8\" (UID: \"0318fd11-5052-44f5-9c8e-b051b67ef2d8\") "
Apr 16 18:38:31.451563 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:31.451537 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0318fd11-5052-44f5-9c8e-b051b67ef2d8-kube-api-access-kkmn4" (OuterVolumeSpecName: "kube-api-access-kkmn4") pod "0318fd11-5052-44f5-9c8e-b051b67ef2d8" (UID: "0318fd11-5052-44f5-9c8e-b051b67ef2d8"). InnerVolumeSpecName "kube-api-access-kkmn4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:38:31.550537 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:31.550509 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kkmn4\" (UniqueName: \"kubernetes.io/projected/0318fd11-5052-44f5-9c8e-b051b67ef2d8-kube-api-access-kkmn4\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\""
Apr 16 18:38:32.082363 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.082330 2580 generic.go:358] "Generic (PLEG): container finished" podID="0318fd11-5052-44f5-9c8e-b051b67ef2d8" containerID="ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086" exitCode=0
Apr 16 18:38:32.082767 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.082377 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-8hfzt" event={"ID":"0318fd11-5052-44f5-9c8e-b051b67ef2d8","Type":"ContainerDied","Data":"ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086"}
Apr 16 18:38:32.082767 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.082380 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-8hfzt"
Apr 16 18:38:32.082767 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.082397 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-8hfzt" event={"ID":"0318fd11-5052-44f5-9c8e-b051b67ef2d8","Type":"ContainerDied","Data":"7a6ecc6319a3515cf7b364734eb7ebfa62dce74cda21ad7c20ea773479ae633d"}
Apr 16 18:38:32.082767 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.082418 2580 scope.go:117] "RemoveContainer" containerID="ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086"
Apr 16 18:38:32.090243 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.090226 2580 scope.go:117] "RemoveContainer" containerID="ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086"
Apr 16 18:38:32.090490 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:38:32.090469 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086\": container with ID starting with ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086 not found: ID does not exist" containerID="ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086"
Apr 16 18:38:32.090584 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.090499 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086"} err="failed to get container status \"ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086\": rpc error: code = NotFound desc = could not find container \"ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086\": container with ID starting with ccfabdf01e9d23b5094885096e2093965c2c1b5d1feb6975b708a1a51aa7a086 not found: ID does not exist"
Apr 16 18:38:32.100633 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.100612 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-8hfzt"]
Apr 16 18:38:32.105526 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:32.105507 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-8hfzt"]
Apr 16 18:38:33.539241 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:33.539212 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0318fd11-5052-44f5-9c8e-b051b67ef2d8" path="/var/lib/kubelet/pods/0318fd11-5052-44f5-9c8e-b051b67ef2d8/volumes"
Apr 16 18:38:34.808853 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:34.808821 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:34.809245 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:34.808869 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:34.813384 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:34.813363 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:35.100726 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:35.100649 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6844f85596-v6nmk"
Apr 16 18:38:35.156201 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:35.156117 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-89ccddb4-ql8kd"]
Apr 16 18:38:40.076298 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.076268 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:40.092076 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.092040 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"]
Apr 16 18:38:40.112590 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.112553 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9" podUID="589e6513-8147-4d34-b036-8abe488d0868" containerName="limitador" containerID="cri-o://1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da" gracePeriod=30
Apr 16 18:38:40.657753 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.657731 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"
Apr 16 18:38:40.732949 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.732918 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j78f4\" (UniqueName: \"kubernetes.io/projected/589e6513-8147-4d34-b036-8abe488d0868-kube-api-access-j78f4\") pod \"589e6513-8147-4d34-b036-8abe488d0868\" (UID: \"589e6513-8147-4d34-b036-8abe488d0868\") "
Apr 16 18:38:40.733103 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.732984 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/589e6513-8147-4d34-b036-8abe488d0868-config-file\") pod \"589e6513-8147-4d34-b036-8abe488d0868\" (UID: \"589e6513-8147-4d34-b036-8abe488d0868\") "
Apr 16 18:38:40.733335 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.733312 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589e6513-8147-4d34-b036-8abe488d0868-config-file" (OuterVolumeSpecName: "config-file") pod "589e6513-8147-4d34-b036-8abe488d0868" (UID: "589e6513-8147-4d34-b036-8abe488d0868"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:38:40.734969 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.734945 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589e6513-8147-4d34-b036-8abe488d0868-kube-api-access-j78f4" (OuterVolumeSpecName: "kube-api-access-j78f4") pod "589e6513-8147-4d34-b036-8abe488d0868" (UID: "589e6513-8147-4d34-b036-8abe488d0868"). InnerVolumeSpecName "kube-api-access-j78f4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:38:40.833693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.833662 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j78f4\" (UniqueName: \"kubernetes.io/projected/589e6513-8147-4d34-b036-8abe488d0868-kube-api-access-j78f4\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\""
Apr 16 18:38:40.833693 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:40.833688 2580 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/589e6513-8147-4d34-b036-8abe488d0868-config-file\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\""
Apr 16 18:38:41.117212 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.117112 2580 generic.go:358] "Generic (PLEG): container finished" podID="589e6513-8147-4d34-b036-8abe488d0868" containerID="1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da" exitCode=0
Apr 16 18:38:41.117212 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.117196 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9" Apr 16 18:38:41.117644 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.117195 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9" event={"ID":"589e6513-8147-4d34-b036-8abe488d0868","Type":"ContainerDied","Data":"1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da"} Apr 16 18:38:41.117644 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.117306 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-wrkg9" event={"ID":"589e6513-8147-4d34-b036-8abe488d0868","Type":"ContainerDied","Data":"cffbfe1a432b6d4f40a660edee60c2990fa8ebe28a411996dbc08846f0834642"} Apr 16 18:38:41.117644 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.117322 2580 scope.go:117] "RemoveContainer" containerID="1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da" Apr 16 18:38:41.125600 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.125585 2580 scope.go:117] "RemoveContainer" containerID="1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da" Apr 16 18:38:41.125836 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:38:41.125819 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da\": container with ID starting with 1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da not found: ID does not exist" containerID="1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da" Apr 16 18:38:41.125877 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.125843 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da"} err="failed to get container status 
\"1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da\": rpc error: code = NotFound desc = could not find container \"1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da\": container with ID starting with 1cc3cac2153f23075afac82a65c77538054e71eed53e9c378c701047ce1a91da not found: ID does not exist" Apr 16 18:38:41.143429 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.141206 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"] Apr 16 18:38:41.148286 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.148263 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wrkg9"] Apr 16 18:38:41.538936 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:41.538906 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589e6513-8147-4d34-b036-8abe488d0868" path="/var/lib/kubelet/pods/589e6513-8147-4d34-b036-8abe488d0868/volumes" Apr 16 18:38:49.505869 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.505840 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-cks8l"] Apr 16 18:38:49.506224 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.506211 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0318fd11-5052-44f5-9c8e-b051b67ef2d8" containerName="authorino" Apr 16 18:38:49.506224 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.506225 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0318fd11-5052-44f5-9c8e-b051b67ef2d8" containerName="authorino" Apr 16 18:38:49.506300 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.506239 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="589e6513-8147-4d34-b036-8abe488d0868" containerName="limitador" Apr 16 18:38:49.506300 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.506244 2580 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="589e6513-8147-4d34-b036-8abe488d0868" containerName="limitador" Apr 16 18:38:49.506300 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.506293 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="589e6513-8147-4d34-b036-8abe488d0868" containerName="limitador" Apr 16 18:38:49.506300 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.506301 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0318fd11-5052-44f5-9c8e-b051b67ef2d8" containerName="authorino" Apr 16 18:38:49.510647 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.510629 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-cks8l" Apr 16 18:38:49.513309 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.513289 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 18:38:49.516543 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.516517 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-cks8l"] Apr 16 18:38:49.615178 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.615128 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c9hz\" (UniqueName: \"kubernetes.io/projected/5f4f52a0-a6f3-4dc7-84a9-dfc0be467776-kube-api-access-8c9hz\") pod \"authorino-68bd676465-cks8l\" (UID: \"5f4f52a0-a6f3-4dc7-84a9-dfc0be467776\") " pod="kuadrant-system/authorino-68bd676465-cks8l" Apr 16 18:38:49.615338 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.615213 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5f4f52a0-a6f3-4dc7-84a9-dfc0be467776-tls-cert\") pod \"authorino-68bd676465-cks8l\" (UID: \"5f4f52a0-a6f3-4dc7-84a9-dfc0be467776\") " pod="kuadrant-system/authorino-68bd676465-cks8l" Apr 16 18:38:49.715837 
ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.715751 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5f4f52a0-a6f3-4dc7-84a9-dfc0be467776-tls-cert\") pod \"authorino-68bd676465-cks8l\" (UID: \"5f4f52a0-a6f3-4dc7-84a9-dfc0be467776\") " pod="kuadrant-system/authorino-68bd676465-cks8l" Apr 16 18:38:49.715837 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.715828 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8c9hz\" (UniqueName: \"kubernetes.io/projected/5f4f52a0-a6f3-4dc7-84a9-dfc0be467776-kube-api-access-8c9hz\") pod \"authorino-68bd676465-cks8l\" (UID: \"5f4f52a0-a6f3-4dc7-84a9-dfc0be467776\") " pod="kuadrant-system/authorino-68bd676465-cks8l" Apr 16 18:38:49.718262 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.718236 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5f4f52a0-a6f3-4dc7-84a9-dfc0be467776-tls-cert\") pod \"authorino-68bd676465-cks8l\" (UID: \"5f4f52a0-a6f3-4dc7-84a9-dfc0be467776\") " pod="kuadrant-system/authorino-68bd676465-cks8l" Apr 16 18:38:49.723705 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.723685 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c9hz\" (UniqueName: \"kubernetes.io/projected/5f4f52a0-a6f3-4dc7-84a9-dfc0be467776-kube-api-access-8c9hz\") pod \"authorino-68bd676465-cks8l\" (UID: \"5f4f52a0-a6f3-4dc7-84a9-dfc0be467776\") " pod="kuadrant-system/authorino-68bd676465-cks8l" Apr 16 18:38:49.820481 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.820438 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-cks8l" Apr 16 18:38:49.937577 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:49.937543 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-cks8l"] Apr 16 18:38:49.941722 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:38:49.941693 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f4f52a0_a6f3_4dc7_84a9_dfc0be467776.slice/crio-59aa9bc45432bf722be9112a9cd334a9803130431b10ae273cec1a64a36f83bc WatchSource:0}: Error finding container 59aa9bc45432bf722be9112a9cd334a9803130431b10ae273cec1a64a36f83bc: Status 404 returned error can't find the container with id 59aa9bc45432bf722be9112a9cd334a9803130431b10ae273cec1a64a36f83bc Apr 16 18:38:50.150042 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:50.150012 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-cks8l" event={"ID":"5f4f52a0-a6f3-4dc7-84a9-dfc0be467776","Type":"ContainerStarted","Data":"59aa9bc45432bf722be9112a9cd334a9803130431b10ae273cec1a64a36f83bc"} Apr 16 18:38:51.155002 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:51.154966 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-cks8l" event={"ID":"5f4f52a0-a6f3-4dc7-84a9-dfc0be467776","Type":"ContainerStarted","Data":"b0398d8d3ef10368b68a8015ede9b1560166fa761c728cfa681410ce12e29514"} Apr 16 18:38:51.172762 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:51.172706 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-cks8l" podStartSLOduration=1.717538608 podStartE2EDuration="2.1726898s" podCreationTimestamp="2026-04-16 18:38:49 +0000 UTC" firstStartedPulling="2026-04-16 18:38:49.943903114 +0000 UTC m=+481.033166685" lastFinishedPulling="2026-04-16 18:38:50.399054304 +0000 UTC m=+481.488317877" 
observedRunningTime="2026-04-16 18:38:51.171176119 +0000 UTC m=+482.260439713" watchObservedRunningTime="2026-04-16 18:38:51.1726898 +0000 UTC m=+482.261953393" Apr 16 18:38:51.204570 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:51.204529 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dssmk"] Apr 16 18:38:51.204832 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:51.204804 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-dssmk" podUID="394e0633-bca8-4b16-ab53-bbe00b6f505b" containerName="authorino" containerID="cri-o://3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035" gracePeriod=30 Apr 16 18:38:51.442130 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:51.442108 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-dssmk" Apr 16 18:38:51.533823 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:51.533788 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbpwn\" (UniqueName: \"kubernetes.io/projected/394e0633-bca8-4b16-ab53-bbe00b6f505b-kube-api-access-mbpwn\") pod \"394e0633-bca8-4b16-ab53-bbe00b6f505b\" (UID: \"394e0633-bca8-4b16-ab53-bbe00b6f505b\") " Apr 16 18:38:51.536195 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:51.536164 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394e0633-bca8-4b16-ab53-bbe00b6f505b-kube-api-access-mbpwn" (OuterVolumeSpecName: "kube-api-access-mbpwn") pod "394e0633-bca8-4b16-ab53-bbe00b6f505b" (UID: "394e0633-bca8-4b16-ab53-bbe00b6f505b"). InnerVolumeSpecName "kube-api-access-mbpwn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:38:51.634588 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:51.634552 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbpwn\" (UniqueName: \"kubernetes.io/projected/394e0633-bca8-4b16-ab53-bbe00b6f505b-kube-api-access-mbpwn\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:38:52.158961 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.158929 2580 generic.go:358] "Generic (PLEG): container finished" podID="394e0633-bca8-4b16-ab53-bbe00b6f505b" containerID="3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035" exitCode=0 Apr 16 18:38:52.159392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.158977 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-dssmk" Apr 16 18:38:52.159392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.159012 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-dssmk" event={"ID":"394e0633-bca8-4b16-ab53-bbe00b6f505b","Type":"ContainerDied","Data":"3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035"} Apr 16 18:38:52.159392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.159049 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-dssmk" event={"ID":"394e0633-bca8-4b16-ab53-bbe00b6f505b","Type":"ContainerDied","Data":"a3be1bae55aaabb6f2502ffc1ee9c182a1ba172a092346f3d6fd25463e195332"} Apr 16 18:38:52.159392 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.159069 2580 scope.go:117] "RemoveContainer" containerID="3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035" Apr 16 18:38:52.167324 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.167303 2580 scope.go:117] "RemoveContainer" containerID="3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035" Apr 16 18:38:52.167639 ip-10-0-135-146 kubenswrapper[2580]: 
E0416 18:38:52.167606 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035\": container with ID starting with 3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035 not found: ID does not exist" containerID="3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035" Apr 16 18:38:52.167718 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.167653 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035"} err="failed to get container status \"3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035\": rpc error: code = NotFound desc = could not find container \"3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035\": container with ID starting with 3f20ab558480a46cf150d26ce8467af9ddab27f88540d2f7f54e9c561cc43035 not found: ID does not exist" Apr 16 18:38:52.176545 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.176523 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dssmk"] Apr 16 18:38:52.181817 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:52.181793 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dssmk"] Apr 16 18:38:53.539462 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:38:53.539428 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394e0633-bca8-4b16-ab53-bbe00b6f505b" path="/var/lib/kubelet/pods/394e0633-bca8-4b16-ab53-bbe00b6f505b/volumes" Apr 16 18:39:00.023536 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.023504 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz"] Apr 16 18:39:00.023993 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.023861 2580 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="394e0633-bca8-4b16-ab53-bbe00b6f505b" containerName="authorino" Apr 16 18:39:00.023993 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.023872 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="394e0633-bca8-4b16-ab53-bbe00b6f505b" containerName="authorino" Apr 16 18:39:00.023993 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.023933 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="394e0633-bca8-4b16-ab53-bbe00b6f505b" containerName="authorino" Apr 16 18:39:00.028384 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.028368 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.031319 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.031294 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 18:39:00.031486 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.031385 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 18:39:00.031486 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.031412 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-djg2l\"" Apr 16 18:39:00.031972 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.031954 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 18:39:00.032270 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.032253 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 18:39:00.032637 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.032620 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 18:39:00.032690 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.032672 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 18:39:00.042632 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.042608 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz"] Apr 16 18:39:00.110807 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.110768 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.110994 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.110823 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.110994 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.110862 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/dcc00ead-054e-43b8-8a01-9b7238ac3473-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.110994 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.110921 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.110994 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.110979 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.111193 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.111007 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmsdt\" (UniqueName: \"kubernetes.io/projected/dcc00ead-054e-43b8-8a01-9b7238ac3473-kube-api-access-bmsdt\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.111193 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.111029 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.176551 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.176503 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-89ccddb4-ql8kd" podUID="2e17c4e0-04ae-49d0-8c52-562cf1661296" 
containerName="console" containerID="cri-o://33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000" gracePeriod=15 Apr 16 18:39:00.212423 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.212388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.212599 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.212444 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.212599 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.212471 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmsdt\" (UniqueName: \"kubernetes.io/projected/dcc00ead-054e-43b8-8a01-9b7238ac3473-kube-api-access-bmsdt\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.212599 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.212498 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.212599 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.212563 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.212822 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.212605 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.212822 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.212637 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/dcc00ead-054e-43b8-8a01-9b7238ac3473-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.213291 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.213265 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.214859 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.214833 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/dcc00ead-054e-43b8-8a01-9b7238ac3473-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: 
\"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.215030 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.214864 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.215130 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.215113 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.215276 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.215259 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/dcc00ead-054e-43b8-8a01-9b7238ac3473-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.223765 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.223743 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dcc00ead-054e-43b8-8a01-9b7238ac3473-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.223979 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.223961 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bmsdt\" (UniqueName: \"kubernetes.io/projected/dcc00ead-054e-43b8-8a01-9b7238ac3473-kube-api-access-bmsdt\") pod \"istiod-openshift-gateway-55ff986f96-qr4wz\" (UID: \"dcc00ead-054e-43b8-8a01-9b7238ac3473\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.338098 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.338058 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:00.445399 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.445375 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-89ccddb4-ql8kd_2e17c4e0-04ae-49d0-8c52-562cf1661296/console/0.log" Apr 16 18:39:00.445496 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.445455 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:39:00.548010 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.547985 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz"] Apr 16 18:39:00.549841 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:39:00.549813 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc00ead_054e_43b8_8a01_9b7238ac3473.slice/crio-ccb55f5b04d1821ea1ec96786e76b571b766dfce1a7ed2d86bf68f9bc6f560dc WatchSource:0}: Error finding container ccb55f5b04d1821ea1ec96786e76b571b766dfce1a7ed2d86bf68f9bc6f560dc: Status 404 returned error can't find the container with id ccb55f5b04d1821ea1ec96786e76b571b766dfce1a7ed2d86bf68f9bc6f560dc Apr 16 18:39:00.616368 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616333 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9jk6\" (UniqueName: 
\"kubernetes.io/projected/2e17c4e0-04ae-49d0-8c52-562cf1661296-kube-api-access-w9jk6\") pod \"2e17c4e0-04ae-49d0-8c52-562cf1661296\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " Apr 16 18:39:00.616566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616403 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-config\") pod \"2e17c4e0-04ae-49d0-8c52-562cf1661296\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " Apr 16 18:39:00.616566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616483 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-service-ca\") pod \"2e17c4e0-04ae-49d0-8c52-562cf1661296\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " Apr 16 18:39:00.616566 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616522 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-trusted-ca-bundle\") pod \"2e17c4e0-04ae-49d0-8c52-562cf1661296\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " Apr 16 18:39:00.616729 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616608 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-oauth-config\") pod \"2e17c4e0-04ae-49d0-8c52-562cf1661296\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " Apr 16 18:39:00.616729 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616657 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-oauth-serving-cert\") pod 
\"2e17c4e0-04ae-49d0-8c52-562cf1661296\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " Apr 16 18:39:00.616729 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616684 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-serving-cert\") pod \"2e17c4e0-04ae-49d0-8c52-562cf1661296\" (UID: \"2e17c4e0-04ae-49d0-8c52-562cf1661296\") " Apr 16 18:39:00.616877 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616841 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-config" (OuterVolumeSpecName: "console-config") pod "2e17c4e0-04ae-49d0-8c52-562cf1661296" (UID: "2e17c4e0-04ae-49d0-8c52-562cf1661296"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:39:00.616931 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616877 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-service-ca" (OuterVolumeSpecName: "service-ca") pod "2e17c4e0-04ae-49d0-8c52-562cf1661296" (UID: "2e17c4e0-04ae-49d0-8c52-562cf1661296"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:39:00.616976 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.616929 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2e17c4e0-04ae-49d0-8c52-562cf1661296" (UID: "2e17c4e0-04ae-49d0-8c52-562cf1661296"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:39:00.617161 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.617125 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-service-ca\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:39:00.617215 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.617182 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-trusted-ca-bundle\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:39:00.617215 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.617193 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:39:00.617288 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.617244 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2e17c4e0-04ae-49d0-8c52-562cf1661296" (UID: "2e17c4e0-04ae-49d0-8c52-562cf1661296"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:39:00.618885 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.618860 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2e17c4e0-04ae-49d0-8c52-562cf1661296" (UID: "2e17c4e0-04ae-49d0-8c52-562cf1661296"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:39:00.618974 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.618892 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2e17c4e0-04ae-49d0-8c52-562cf1661296" (UID: "2e17c4e0-04ae-49d0-8c52-562cf1661296"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:39:00.618974 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.618905 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e17c4e0-04ae-49d0-8c52-562cf1661296-kube-api-access-w9jk6" (OuterVolumeSpecName: "kube-api-access-w9jk6") pod "2e17c4e0-04ae-49d0-8c52-562cf1661296" (UID: "2e17c4e0-04ae-49d0-8c52-562cf1661296"). InnerVolumeSpecName "kube-api-access-w9jk6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:39:00.718579 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.718538 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-oauth-config\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:39:00.718579 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.718577 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e17c4e0-04ae-49d0-8c52-562cf1661296-oauth-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:39:00.718579 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.718589 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e17c4e0-04ae-49d0-8c52-562cf1661296-console-serving-cert\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:39:00.718802 
ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:00.718598 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9jk6\" (UniqueName: \"kubernetes.io/projected/2e17c4e0-04ae-49d0-8c52-562cf1661296-kube-api-access-w9jk6\") on node \"ip-10-0-135-146.ec2.internal\" DevicePath \"\"" Apr 16 18:39:01.192283 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.192240 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" event={"ID":"dcc00ead-054e-43b8-8a01-9b7238ac3473","Type":"ContainerStarted","Data":"ccb55f5b04d1821ea1ec96786e76b571b766dfce1a7ed2d86bf68f9bc6f560dc"} Apr 16 18:39:01.193672 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.193652 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-89ccddb4-ql8kd_2e17c4e0-04ae-49d0-8c52-562cf1661296/console/0.log" Apr 16 18:39:01.193794 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.193709 2580 generic.go:358] "Generic (PLEG): container finished" podID="2e17c4e0-04ae-49d0-8c52-562cf1661296" containerID="33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000" exitCode=2 Apr 16 18:39:01.193851 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.193810 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89ccddb4-ql8kd" event={"ID":"2e17c4e0-04ae-49d0-8c52-562cf1661296","Type":"ContainerDied","Data":"33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000"} Apr 16 18:39:01.193851 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.193836 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-89ccddb4-ql8kd" Apr 16 18:39:01.193851 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.193846 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89ccddb4-ql8kd" event={"ID":"2e17c4e0-04ae-49d0-8c52-562cf1661296","Type":"ContainerDied","Data":"8f19d1f4a66506536aaa4897198536b5cea900982b72a82dd1f55bc897cdb08e"} Apr 16 18:39:01.193990 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.193863 2580 scope.go:117] "RemoveContainer" containerID="33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000" Apr 16 18:39:01.205632 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.205611 2580 scope.go:117] "RemoveContainer" containerID="33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000" Apr 16 18:39:01.205981 ip-10-0-135-146 kubenswrapper[2580]: E0416 18:39:01.205944 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000\": container with ID starting with 33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000 not found: ID does not exist" containerID="33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000" Apr 16 18:39:01.206123 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.205980 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000"} err="failed to get container status \"33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000\": rpc error: code = NotFound desc = could not find container \"33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000\": container with ID starting with 33f45120441c431e9033b5a1491eb374d450f618cb1b17af816dbc1e46073000 not found: ID does not exist" Apr 16 18:39:01.238288 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.238237 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-89ccddb4-ql8kd"] Apr 16 18:39:01.246318 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.246287 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-89ccddb4-ql8kd"] Apr 16 18:39:01.541055 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:01.541010 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e17c4e0-04ae-49d0-8c52-562cf1661296" path="/var/lib/kubelet/pods/2e17c4e0-04ae-49d0-8c52-562cf1661296/volumes" Apr 16 18:39:03.020402 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:03.020361 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 18:39:03.020687 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:03.020432 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 18:39:03.205396 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:03.205349 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" event={"ID":"dcc00ead-054e-43b8-8a01-9b7238ac3473","Type":"ContainerStarted","Data":"11bd028d528e2a7eedec7f83bbb13d4fd4baa64f21e69d7888de5b558069980e"} Apr 16 18:39:03.205585 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:03.205539 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:39:03.239073 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:03.239001 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" podStartSLOduration=1.770598573 podStartE2EDuration="4.238979061s" podCreationTimestamp="2026-04-16 18:38:59 +0000 
UTC" firstStartedPulling="2026-04-16 18:39:00.551709693 +0000 UTC m=+491.640973265" lastFinishedPulling="2026-04-16 18:39:03.020090168 +0000 UTC m=+494.109353753" observedRunningTime="2026-04-16 18:39:03.236437109 +0000 UTC m=+494.325700703" watchObservedRunningTime="2026-04-16 18:39:03.238979061 +0000 UTC m=+494.328242657" Apr 16 18:39:04.211276 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:39:04.211247 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qr4wz" Apr 16 18:47:59.224695 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.224657 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-75844779d6-dcxh5"] Apr 16 18:47:59.225100 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.225013 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e17c4e0-04ae-49d0-8c52-562cf1661296" containerName="console" Apr 16 18:47:59.225100 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.225024 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e17c4e0-04ae-49d0-8c52-562cf1661296" containerName="console" Apr 16 18:47:59.225100 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.225075 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e17c4e0-04ae-49d0-8c52-562cf1661296" containerName="console" Apr 16 18:47:59.227296 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.227276 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:47:59.230703 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.230681 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zlbts\"" Apr 16 18:47:59.231610 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.231586 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:47:59.231709 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.231653 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:47:59.231709 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.231681 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:47:59.239220 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.238291 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-75844779d6-dcxh5"] Apr 16 18:47:59.356335 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.356283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdphs\" (UniqueName: \"kubernetes.io/projected/8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd-kube-api-access-gdphs\") pod \"llmisvc-controller-manager-75844779d6-dcxh5\" (UID: \"8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd\") " pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:47:59.356528 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.356359 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd-cert\") pod \"llmisvc-controller-manager-75844779d6-dcxh5\" (UID: \"8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd\") " pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" 
Apr 16 18:47:59.457286 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.457235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdphs\" (UniqueName: \"kubernetes.io/projected/8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd-kube-api-access-gdphs\") pod \"llmisvc-controller-manager-75844779d6-dcxh5\" (UID: \"8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd\") " pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:47:59.457462 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.457305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd-cert\") pod \"llmisvc-controller-manager-75844779d6-dcxh5\" (UID: \"8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd\") " pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:47:59.459613 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.459583 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd-cert\") pod \"llmisvc-controller-manager-75844779d6-dcxh5\" (UID: \"8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd\") " pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:47:59.466421 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.466398 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdphs\" (UniqueName: \"kubernetes.io/projected/8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd-kube-api-access-gdphs\") pod \"llmisvc-controller-manager-75844779d6-dcxh5\" (UID: \"8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd\") " pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:47:59.544198 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.544105 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:47:59.668055 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.668022 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-75844779d6-dcxh5"] Apr 16 18:47:59.670953 ip-10-0-135-146 kubenswrapper[2580]: W0416 18:47:59.670922 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8ee14d26_2997_481a_bc9f_5f9c2bf5f6bd.slice/crio-73d58b9b3cc7968f19c6b6eca0481d5f459095f80e541a5974e573f1b95fc43d WatchSource:0}: Error finding container 73d58b9b3cc7968f19c6b6eca0481d5f459095f80e541a5974e573f1b95fc43d: Status 404 returned error can't find the container with id 73d58b9b3cc7968f19c6b6eca0481d5f459095f80e541a5974e573f1b95fc43d Apr 16 18:47:59.672375 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:47:59.672354 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:48:00.040651 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:48:00.040618 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" event={"ID":"8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd","Type":"ContainerStarted","Data":"73d58b9b3cc7968f19c6b6eca0481d5f459095f80e541a5974e573f1b95fc43d"} Apr 16 18:48:03.052275 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:48:03.052238 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" event={"ID":"8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd","Type":"ContainerStarted","Data":"bef52626e49f8d081bcde673d9d8681bed9c39605f8d1aac8aab27332c85b987"} Apr 16 18:48:03.052567 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:48:03.052391 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:48:03.073059 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:48:03.073008 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" podStartSLOduration=0.816552154 podStartE2EDuration="4.072993096s" podCreationTimestamp="2026-04-16 18:47:59 +0000 UTC" firstStartedPulling="2026-04-16 18:47:59.672535597 +0000 UTC m=+1030.761799181" lastFinishedPulling="2026-04-16 18:48:02.928976551 +0000 UTC m=+1034.018240123" observedRunningTime="2026-04-16 18:48:03.071892479 +0000 UTC m=+1034.161156070" watchObservedRunningTime="2026-04-16 18:48:03.072993096 +0000 UTC m=+1034.162256691" Apr 16 18:48:34.058005 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:48:34.057934 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-75844779d6-dcxh5" Apr 16 18:59:55.448420 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:59:55.448388 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-qr4wz_dcc00ead-054e-43b8-8a01-9b7238ac3473/discovery/0.log" Apr 16 18:59:56.292432 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:59:56.292398 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-qr4wz_dcc00ead-054e-43b8-8a01-9b7238ac3473/discovery/0.log" Apr 16 18:59:57.110974 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:59:57.110945 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-cks8l_5f4f52a0-a6f3-4dc7-84a9-dfc0be467776/authorino/0.log" Apr 16 18:59:57.186139 ip-10-0-135-146 kubenswrapper[2580]: I0416 18:59:57.186111 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-84zc6_321dfd1b-d3ed-4840-a323-fab5e59a3836/manager/0.log" Apr 16 19:00:02.667266 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:02.667240 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-989rq_d2366368-d34f-4493-b764-4aa4105b1922/global-pull-secret-syncer/0.log" Apr 16 19:00:02.767557 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:02.767531 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-22wdj_15291a43-2092-47ea-b8ab-a7363155516e/konnectivity-agent/0.log" Apr 16 19:00:02.862191 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:02.862166 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-146.ec2.internal_c7acda094cc50b1e38ea78cc83b01fd2/haproxy/0.log" Apr 16 19:00:07.070450 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:07.070414 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-cks8l_5f4f52a0-a6f3-4dc7-84a9-dfc0be467776/authorino/0.log" Apr 16 19:00:07.208655 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:07.208621 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-84zc6_321dfd1b-d3ed-4840-a323-fab5e59a3836/manager/0.log" Apr 16 19:00:08.241930 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.241901 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa3404b2-7cf8-4cc7-b029-8cf2ef45e206/alertmanager/0.log" Apr 16 19:00:08.268188 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.268163 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa3404b2-7cf8-4cc7-b029-8cf2ef45e206/config-reloader/0.log" Apr 16 19:00:08.290726 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.290707 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa3404b2-7cf8-4cc7-b029-8cf2ef45e206/kube-rbac-proxy-web/0.log" Apr 16 19:00:08.312833 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.312814 2580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa3404b2-7cf8-4cc7-b029-8cf2ef45e206/kube-rbac-proxy/0.log" Apr 16 19:00:08.332748 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.332730 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa3404b2-7cf8-4cc7-b029-8cf2ef45e206/kube-rbac-proxy-metric/0.log" Apr 16 19:00:08.354456 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.354437 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa3404b2-7cf8-4cc7-b029-8cf2ef45e206/prom-label-proxy/0.log" Apr 16 19:00:08.375700 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.375678 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_aa3404b2-7cf8-4cc7-b029-8cf2ef45e206/init-config-reloader/0.log" Apr 16 19:00:08.443000 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.442980 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6dwrf_51090ddf-429a-4d9e-a370-e6eb5bd84777/kube-state-metrics/0.log" Apr 16 19:00:08.463093 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.463073 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6dwrf_51090ddf-429a-4d9e-a370-e6eb5bd84777/kube-rbac-proxy-main/0.log" Apr 16 19:00:08.486686 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.486668 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6dwrf_51090ddf-429a-4d9e-a370-e6eb5bd84777/kube-rbac-proxy-self/0.log" Apr 16 19:00:08.511419 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.511365 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5776bc855b-mrbhh_e0336541-4641-42a2-9a31-69bf1667218c/metrics-server/0.log" Apr 16 19:00:08.535624 ip-10-0-135-146 
kubenswrapper[2580]: I0416 19:00:08.535604 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-j6nsg_31981e96-805e-41b8-a6dd-0d2bf7af45d8/monitoring-plugin/0.log" Apr 16 19:00:08.567986 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.567968 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-46rt7_4c490b1a-eb3a-4699-8602-4ab84dc9d32b/node-exporter/0.log" Apr 16 19:00:08.592292 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.592274 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-46rt7_4c490b1a-eb3a-4699-8602-4ab84dc9d32b/kube-rbac-proxy/0.log" Apr 16 19:00:08.613494 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.613474 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-46rt7_4c490b1a-eb3a-4699-8602-4ab84dc9d32b/init-textfile/0.log" Apr 16 19:00:08.809824 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.809744 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-5zjsb_929db997-d29c-44d5-9142-d93108884045/kube-rbac-proxy-main/0.log" Apr 16 19:00:08.840401 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.840368 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-5zjsb_929db997-d29c-44d5-9142-d93108884045/kube-rbac-proxy-self/0.log" Apr 16 19:00:08.860937 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.860915 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-5zjsb_929db997-d29c-44d5-9142-d93108884045/openshift-state-metrics/0.log" Apr 16 19:00:08.941162 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.941128 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73d5455b-55b6-44a7-8c92-e363b7d1b2f2/prometheus/0.log" Apr 16 19:00:08.968907 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.968870 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73d5455b-55b6-44a7-8c92-e363b7d1b2f2/config-reloader/0.log" Apr 16 19:00:08.997449 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:08.997420 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73d5455b-55b6-44a7-8c92-e363b7d1b2f2/thanos-sidecar/0.log" Apr 16 19:00:09.033081 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.033059 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73d5455b-55b6-44a7-8c92-e363b7d1b2f2/kube-rbac-proxy-web/0.log" Apr 16 19:00:09.066841 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.066774 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73d5455b-55b6-44a7-8c92-e363b7d1b2f2/kube-rbac-proxy/0.log" Apr 16 19:00:09.094799 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.094772 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73d5455b-55b6-44a7-8c92-e363b7d1b2f2/kube-rbac-proxy-thanos/0.log" Apr 16 19:00:09.122047 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.122017 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73d5455b-55b6-44a7-8c92-e363b7d1b2f2/init-config-reloader/0.log" Apr 16 19:00:09.157053 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.157027 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-25s9k_05c43021-6375-457e-857d-d95cce06e340/prometheus-operator/0.log" Apr 16 19:00:09.182249 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.182223 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-25s9k_05c43021-6375-457e-857d-d95cce06e340/kube-rbac-proxy/0.log" Apr 16 19:00:09.232402 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.232368 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-wh7nr_e1ab7fe5-93ad-411e-8815-6ff339e612fc/prometheus-operator-admission-webhook/0.log" Apr 16 19:00:09.360469 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.360392 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d54fc9d9-x7l2k_ca31ecdf-a575-4275-9d30-6215d448e698/thanos-query/0.log" Apr 16 19:00:09.384879 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.384857 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d54fc9d9-x7l2k_ca31ecdf-a575-4275-9d30-6215d448e698/kube-rbac-proxy-web/0.log" Apr 16 19:00:09.412021 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.411993 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d54fc9d9-x7l2k_ca31ecdf-a575-4275-9d30-6215d448e698/kube-rbac-proxy/0.log" Apr 16 19:00:09.434866 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.434840 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d54fc9d9-x7l2k_ca31ecdf-a575-4275-9d30-6215d448e698/prom-label-proxy/0.log" Apr 16 19:00:09.463350 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.463326 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d54fc9d9-x7l2k_ca31ecdf-a575-4275-9d30-6215d448e698/kube-rbac-proxy-rules/0.log" Apr 16 19:00:09.482884 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:09.482854 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d54fc9d9-x7l2k_ca31ecdf-a575-4275-9d30-6215d448e698/kube-rbac-proxy-metrics/0.log" 
Apr 16 19:00:11.593699 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.593672 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6844f85596-v6nmk_3bcd6742-aa1c-4ae5-baa3-c5856b2c17cd/console/0.log"
Apr 16 19:00:11.740114 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.740078 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"]
Apr 16 19:00:11.743791 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.743771 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.746523 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.746500 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-djjwc\"/\"kube-root-ca.crt\""
Apr 16 19:00:11.746635 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.746507 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-djjwc\"/\"openshift-service-ca.crt\""
Apr 16 19:00:11.746698 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.746649 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-djjwc\"/\"default-dockercfg-bxwpq\""
Apr 16 19:00:11.753349 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.753325 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"]
Apr 16 19:00:11.817534 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.817501 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-lib-modules\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.817534 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.817540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tlms\" (UniqueName: \"kubernetes.io/projected/b0f78481-c629-45b5-9d9e-4e721ca3b90b-kube-api-access-9tlms\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.817777 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.817571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-podres\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.817777 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.817716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-sys\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.817777 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.817759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-proc\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919096 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.918995 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-sys\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919096 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.919053 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-proc\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919096 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.919086 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-lib-modules\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919371 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.919111 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tlms\" (UniqueName: \"kubernetes.io/projected/b0f78481-c629-45b5-9d9e-4e721ca3b90b-kube-api-access-9tlms\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919371 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.919123 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-proc\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919371 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.919125 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-sys\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919371 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.919180 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-podres\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919371 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.919247 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-lib-modules\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.919371 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.919336 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b0f78481-c629-45b5-9d9e-4e721ca3b90b-podres\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:11.927780 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:11.927760 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tlms\" (UniqueName: \"kubernetes.io/projected/b0f78481-c629-45b5-9d9e-4e721ca3b90b-kube-api-access-9tlms\") pod \"perf-node-gather-daemonset-5lqvw\" (UID: \"b0f78481-c629-45b5-9d9e-4e721ca3b90b\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:12.054963 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.054927 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:12.179199 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.179171 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"]
Apr 16 19:00:12.181665 ip-10-0-135-146 kubenswrapper[2580]: W0416 19:00:12.181636 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb0f78481_c629_45b5_9d9e_4e721ca3b90b.slice/crio-99e7083a019d89f03780af62d9d7d0a12b016404b9e190031b2ddde91e3aa96a WatchSource:0}: Error finding container 99e7083a019d89f03780af62d9d7d0a12b016404b9e190031b2ddde91e3aa96a: Status 404 returned error can't find the container with id 99e7083a019d89f03780af62d9d7d0a12b016404b9e190031b2ddde91e3aa96a
Apr 16 19:00:12.183232 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.183205 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:00:12.512315 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.512280 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw" event={"ID":"b0f78481-c629-45b5-9d9e-4e721ca3b90b","Type":"ContainerStarted","Data":"2e1b9cb2d6d40a930751023e20999a651d69265ace59edd3b8c3d15e211d3eae"}
Apr 16 19:00:12.512315 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.512315 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw" event={"ID":"b0f78481-c629-45b5-9d9e-4e721ca3b90b","Type":"ContainerStarted","Data":"99e7083a019d89f03780af62d9d7d0a12b016404b9e190031b2ddde91e3aa96a"}
Apr 16 19:00:12.512541 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.512411 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:12.531840 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.531788 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw" podStartSLOduration=1.531770903 podStartE2EDuration="1.531770903s" podCreationTimestamp="2026-04-16 19:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:00:12.530890126 +0000 UTC m=+1763.620153720" watchObservedRunningTime="2026-04-16 19:00:12.531770903 +0000 UTC m=+1763.621034645"
Apr 16 19:00:12.939827 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.939758 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m2f22_0a157f4b-852b-4fc1-867b-72319f3a23ef/dns/0.log"
Apr 16 19:00:12.968088 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:12.968066 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m2f22_0a157f4b-852b-4fc1-867b-72319f3a23ef/kube-rbac-proxy/0.log"
Apr 16 19:00:13.046050 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:13.046014 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sxsqt_e25c8650-bbd6-4127-a3ea-3a79b45748b6/dns-node-resolver/0.log"
Apr 16 19:00:13.510556 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:13.510523 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-65688f687b-mwsdh_9922e3fb-d0ae-4fbd-b96a-81793a1f521a/registry/0.log"
Apr 16 19:00:13.582638 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:13.582606 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jz4gk_3622caf8-9f1b-49f3-8219-6df05c25252f/node-ca/0.log"
Apr 16 19:00:14.442164 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:14.442123 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-qr4wz_dcc00ead-054e-43b8-8a01-9b7238ac3473/discovery/0.log"
Apr 16 19:00:14.995506 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:14.995469 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-m6kqv_5db233fe-b415-4303-a7db-96df89fba6f1/serve-healthcheck-canary/0.log"
Apr 16 19:00:15.476666 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:15.476645 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6d74h_f721ab0c-9cef-4965-905d-b47537a6ad94/kube-rbac-proxy/0.log"
Apr 16 19:00:15.500509 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:15.500476 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6d74h_f721ab0c-9cef-4965-905d-b47537a6ad94/exporter/0.log"
Apr 16 19:00:15.526273 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:15.526243 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6d74h_f721ab0c-9cef-4965-905d-b47537a6ad94/extractor/0.log"
Apr 16 19:00:18.271375 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:18.271341 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7fd84c546d-q2vh7_0e2be300-3e78-4416-8f7f-ac42cbb89d9e/manager/0.log"
Apr 16 19:00:18.526348 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:18.526282 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-5lqvw"
Apr 16 19:00:18.933467 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:18.933345 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-75844779d6-dcxh5_8ee14d26-2997-481a-bc9f-5f9c2bf5f6bd/manager/0.log"
Apr 16 19:00:25.673436 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:25.673409 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zrvtq_c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:00:25.694353 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:25.694329 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zrvtq_c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9/egress-router-binary-copy/0.log"
Apr 16 19:00:25.717363 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:25.717339 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zrvtq_c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9/cni-plugins/0.log"
Apr 16 19:00:25.742825 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:25.742797 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zrvtq_c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9/bond-cni-plugin/0.log"
Apr 16 19:00:25.762952 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:25.762926 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zrvtq_c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9/routeoverride-cni/0.log"
Apr 16 19:00:25.785707 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:25.785674 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zrvtq_c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9/whereabouts-cni-bincopy/0.log"
Apr 16 19:00:25.808085 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:25.808063 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zrvtq_c7fce7b3-ec9b-49f6-8e70-8a22ed2f44c9/whereabouts-cni/0.log"
Apr 16 19:00:25.886689 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:25.886653 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ncxt7_e1300a99-4d7a-47e3-9d8e-404608c14ae7/kube-multus/0.log"
Apr 16 19:00:26.049523 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:26.049497 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sjjgw_642f0536-3a1b-4d5c-bb3d-e7128392b218/network-metrics-daemon/0.log"
Apr 16 19:00:26.074654 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:26.074631 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sjjgw_642f0536-3a1b-4d5c-bb3d-e7128392b218/kube-rbac-proxy/0.log"
Apr 16 19:00:26.893974 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:26.893946 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6nzqh_6a615bb3-b76b-4c92-9085-2d164914c2aa/ovn-controller/0.log"
Apr 16 19:00:26.920764 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:26.920735 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6nzqh_6a615bb3-b76b-4c92-9085-2d164914c2aa/ovn-acl-logging/0.log"
Apr 16 19:00:26.937644 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:26.937619 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6nzqh_6a615bb3-b76b-4c92-9085-2d164914c2aa/kube-rbac-proxy-node/0.log"
Apr 16 19:00:26.960532 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:26.960510 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6nzqh_6a615bb3-b76b-4c92-9085-2d164914c2aa/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:00:26.981540 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:26.981515 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6nzqh_6a615bb3-b76b-4c92-9085-2d164914c2aa/northd/0.log"
Apr 16 19:00:27.002599 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:27.002578 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6nzqh_6a615bb3-b76b-4c92-9085-2d164914c2aa/nbdb/0.log"
Apr 16 19:00:27.023029 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:27.023004 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6nzqh_6a615bb3-b76b-4c92-9085-2d164914c2aa/sbdb/0.log"
Apr 16 19:00:27.116042 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:27.116017 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6nzqh_6a615bb3-b76b-4c92-9085-2d164914c2aa/ovnkube-controller/0.log"
Apr 16 19:00:28.999263 ip-10-0-135-146 kubenswrapper[2580]: I0416 19:00:28.999239 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2bwhf_ba95d7cd-292c-41ec-8417-d3768d65716d/network-check-target-container/0.log"