Apr 16 23:25:59.251017 ip-10-0-131-43 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 23:25:59.744840 ip-10-0-131-43 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:25:59.744840 ip-10-0-131-43 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 23:25:59.744840 ip-10-0-131-43 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:25:59.744840 ip-10-0-131-43 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 23:25:59.744840 ip-10-0-131-43 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
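The deprecation warnings above all point at the kubelet config file named by --config. A minimal sketch of the equivalent KubeletConfiguration fields, with purely illustrative example values (the actual values on this node are whatever the flags carried):

```yaml
# Hypothetical KubeletConfiguration fragment; field names are the upstream
# config-file equivalents of the deprecated flags in the log above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"  # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # replaces --volume-plugin-dir
systemReserved:                  # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:                    # replaces --minimum-container-ttl-duration per the warning
  memory.available: "100Mi"
```

--pod-infra-container-image has no config-file equivalent; per the log, the image garbage collector will take the sandbox image from the CRI runtime instead.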
Apr 16 23:25:59.748415 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.748324 2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 23:25:59.751619 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751603 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:25:59.751619 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751619 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751623 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751627 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751631 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751634 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751637 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751640 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751643 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751646 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751649 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751652 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751655 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751657 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751660 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751663 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751665 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751668 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751678 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751681 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:25:59.751689 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751683 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751686 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751689 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751692 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751706 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751709 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751712 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751715 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751718 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751721 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751723 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751726 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751729 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751732 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751735 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751738 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751740 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751743 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751745 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751747 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:25:59.752192 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751750 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751752 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751755 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751757 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751760 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751762 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751765 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751767 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751770 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751772 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751775 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751778 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751780 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751783 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751787 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751790 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751793 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751795 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751798 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751802 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:25:59.752733 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751804 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751807 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751809 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751812 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751814 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751817 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751819 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751822 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751825 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751827 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751830 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751840 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751843 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751845 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751848 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751850 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751853 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751856 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751858 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751862 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:25:59.753213 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751865 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751870 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751874 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751877 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751880 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.751884 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752356 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752363 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752366 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752369 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752373 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752377 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752380 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752383 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752386 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752389 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752391 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752394 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752396 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:25:59.753693 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752399 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752402 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752404 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752407 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752410 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752412 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752415 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752417 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752420 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752422 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752424 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752427 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752429 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752432 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752434 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752437 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752440 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752443 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752446 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752449 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:25:59.754183 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752451 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752454 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752457 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752459 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752462 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752464 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752467 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752469 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752471 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752474 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752476 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752479 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752481 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752484 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752486 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752488 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752491 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752493 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752496 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:25:59.754782 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752499 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752501 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752503 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752506 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752510 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752512 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752515 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752518 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752520 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752523 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752526 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752528 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752531 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752533 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752536 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752538 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752541 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752543 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752546 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752548 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:25:59.755601 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752552 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752555 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752558 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752561 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752563 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752566 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752568 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752572 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752574 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752577 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752579 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752582 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752585 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.752587 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753902 2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753914 2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753920 2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753924 2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753929 2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753932 2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753937 2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 23:25:59.756172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753941 2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753945 2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753948 2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753952 2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753955 2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753960 2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753963 2564 flags.go:64] FLAG: --cgroup-root=""
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753966 2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753969 2564 flags.go:64] FLAG: --client-ca-file=""
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753972 2564 flags.go:64] FLAG: --cloud-config=""
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753975 2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753977 2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753982 2564 flags.go:64] FLAG: --cluster-domain=""
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753985 2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753988 2564 flags.go:64] FLAG: --config-dir=""
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753991 2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753994 2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.753998 2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754002 2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754005 2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754008 2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754011 2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754014 2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754018 2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754021 2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 23:25:59.756797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754024 2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754028 2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754031 2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754034 2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754037 2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754040 2564 flags.go:64] FLAG: --enable-server="true"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754043 2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754046 2564 flags.go:64] FLAG: --event-burst="100"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754050 2564 flags.go:64] FLAG: --event-qps="50"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754053 2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754056 2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754059 2564 flags.go:64] FLAG: --eviction-hard=""
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754063 2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754066 2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754069 2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754072 2564 flags.go:64] FLAG: --eviction-soft=""
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754075 2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754078 2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754081 2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754084 2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754086 2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754089 2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754092 2564 flags.go:64] FLAG: --feature-gates=""
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754096 2564 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754099 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 23:25:59.757427 ip-10-0-131-43 kubenswrapper[2564]:
I0416 23:25:59.754103 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754106 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754109 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754112 2564 flags.go:64] FLAG: --help="false" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754115 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-131-43.ec2.internal" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754118 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754121 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754124 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754128 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754131 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754134 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754137 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754140 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754143 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 23:25:59.758055 
ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754146 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754149 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754152 2564 flags.go:64] FLAG: --kube-reserved="" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754155 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754157 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754161 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754163 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754166 2564 flags.go:64] FLAG: --lock-file="" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754169 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754172 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 23:25:59.758055 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754176 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754181 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754184 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754188 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754191 2564 flags.go:64] FLAG: 
--logging-format="text" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754193 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754197 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754200 2564 flags.go:64] FLAG: --manifest-url="" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754203 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754208 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754211 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754215 2564 flags.go:64] FLAG: --max-pods="110" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754218 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754221 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754225 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754228 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754231 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754234 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754237 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 
23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754245 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754248 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754251 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754254 2564 flags.go:64] FLAG: --pod-cidr="" Apr 16 23:25:59.758672 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754257 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754262 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754265 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754268 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754271 2564 flags.go:64] FLAG: --port="10250" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754274 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754277 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b465534f47b0e415" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754280 2564 flags.go:64] FLAG: --qos-reserved="" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754283 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754287 2564 flags.go:64] FLAG: --register-node="true" Apr 16 23:25:59.759248 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:25:59.754290 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754293 2564 flags.go:64] FLAG: --register-with-taints="" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754297 2564 flags.go:64] FLAG: --registry-burst="10" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754300 2564 flags.go:64] FLAG: --registry-qps="5" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754302 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754305 2564 flags.go:64] FLAG: --reserved-memory="" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754310 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754313 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754316 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754319 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754322 2564 flags.go:64] FLAG: --runonce="false" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754325 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754328 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754331 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754334 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 23:25:59.759248 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:25:59.754337 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 23:25:59.759248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754340 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754344 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754347 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754349 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754352 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754355 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754358 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754361 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754364 2564 flags.go:64] FLAG: --system-cgroups="" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754366 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754372 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754375 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754378 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754382 2564 flags.go:64] FLAG: 
--tls-min-version="" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754384 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754387 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754390 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754393 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754396 2564 flags.go:64] FLAG: --v="2" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754400 2564 flags.go:64] FLAG: --version="false" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754404 2564 flags.go:64] FLAG: --vmodule="" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754409 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.754412 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.754965 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 23:25:59.759935 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.754990 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.754999 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755007 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755015 2564 feature_gate.go:328] unrecognized 
feature gate: KMSEncryptionProvider Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755023 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755030 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755037 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755045 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755054 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755062 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755070 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755077 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755084 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755097 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755103 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755107 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 23:25:59.760503 ip-10-0-131-43 
kubenswrapper[2564]: W0416 23:25:59.755112 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755116 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755120 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 23:25:59.760503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755124 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755128 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755132 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755136 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755140 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755145 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755149 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755158 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755163 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755168 2564 feature_gate.go:328] unrecognized feature gate: 
PinnedImages Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755172 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755176 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755180 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755184 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755188 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755193 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755197 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755201 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755206 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755215 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755219 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 23:25:59.761007 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755223 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755228 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes 
Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755233 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755237 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755241 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755245 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755250 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755253 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755257 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755261 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755266 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755275 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755279 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755284 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755288 2564 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755295 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755299 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755303 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755307 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755312 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 23:25:59.761521 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755316 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755321 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755325 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755337 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755342 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755347 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755352 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755356 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755361 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755365 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755370 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755375 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755380 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755384 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755390 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755395 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755406 2564 feature_gate.go:328] 
unrecognized feature gate: DualReplica Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755410 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755415 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 23:25:59.762083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755421 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 23:25:59.762547 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755426 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 23:25:59.762547 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755431 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 23:25:59.762547 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755435 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 23:25:59.762547 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755439 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 23:25:59.762547 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.755443 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 23:25:59.762547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.756246 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 23:25:59.763129 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:25:59.763108 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 23:25:59.763166 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.763130 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 23:25:59.763197 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763192 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763198 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763201 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763205 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763208 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763211 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763214 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763217 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763220 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763223 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763225 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 23:25:59.763227 ip-10-0-131-43 
kubenswrapper[2564]: W0416 23:25:59.763228 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 23:25:59.763227 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763231 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763235 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763240 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763242 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763245 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763247 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763250 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763252 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763255 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763259 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763263 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763266 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763269 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763272 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763275 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763277 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763280 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763282 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763285 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 23:25:59.763518 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763288 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763291 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763293 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763296 2564 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763298 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763301 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763303 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763306 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763309 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763311 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763314 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763316 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763319 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763321 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763324 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763328 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 
23:25:59.763330 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763333 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763335 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763338 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 23:25:59.764040 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763340 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763343 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763346 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763349 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763351 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763354 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763357 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763360 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763362 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 23:25:59.764550 ip-10-0-131-43 
kubenswrapper[2564]: W0416 23:25:59.763365 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763367 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763369 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763372 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763374 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763377 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763380 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763382 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763384 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763387 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763389 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 23:25:59.764550 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763392 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763394 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 23:25:59.765056 ip-10-0-131-43 
kubenswrapper[2564]: W0416 23:25:59.763397 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763400 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763402 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763405 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763408 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763411 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763413 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763416 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763418 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763421 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763424 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763426 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763429 
2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 23:25:59.765056 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.763434 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763529 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763533 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763537 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763539 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763542 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763545 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763547 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763551 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763555 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763558 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763560 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763563 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763566 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763568 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763571 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763573 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763576 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763579 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 23:25:59.765424 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763581 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763584 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 23:25:59.765896 ip-10-0-131-43 
kubenswrapper[2564]: W0416 23:25:59.763586 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763589 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763591 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763595 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763598 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763600 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763603 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763605 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763608 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763611 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763614 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763616 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763619 2564 feature_gate.go:328] unrecognized feature gate: 
NetworkDiagnosticsConfig Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763621 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763624 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763627 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763629 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763631 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 23:25:59.765896 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763634 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763636 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763639 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763642 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763644 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763646 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763649 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763651 2564 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763654 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763656 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763659 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763661 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763663 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763666 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763668 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763671 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763674 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763679 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763682 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763685 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 23:25:59.766398 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763688 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763691 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763694 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763709 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763712 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763715 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763718 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763720 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763723 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763726 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763728 2564 feature_gate.go:328] unrecognized feature 
gate: CPMSMachineNamePrefix Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763731 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763734 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763737 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763739 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763742 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763744 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763747 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763750 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763752 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 23:25:59.766891 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763755 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763758 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763760 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763763 2564 feature_gate.go:328] unrecognized 
feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763766 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763768 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763771 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:25:59.763774 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.763779 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 23:25:59.767373 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.764653 2564 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 23:25:59.770767 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.770752 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 23:25:59.771934 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.771922 2564 server.go:1019] "Starting client certificate rotation" Apr 16 23:25:59.772049 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.772030 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 23:25:59.772112 
ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.772084 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 23:25:59.800875 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.800851 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 23:25:59.806270 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.806248 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 23:25:59.821633 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.821606 2564 log.go:25] "Validated CRI v1 runtime API" Apr 16 23:25:59.825849 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.825827 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 23:25:59.828065 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.828041 2564 log.go:25] "Validated CRI v1 image API" Apr 16 23:25:59.830014 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.829991 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 23:25:59.834394 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.834370 2564 fs.go:135] Filesystem UUIDs: map[60d85cde-5d86-4de9-b565-a653b47a10ec:/dev/nvme0n1p4 73ff3078-8415-4b44-8f99-a657bd07bc0b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 16 23:25:59.834474 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.834395 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 
fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 23:25:59.840415 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.840303 2564 manager.go:217] Machine: {Timestamp:2026-04-16 23:25:59.838176965 +0000 UTC m=+0.454208435 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099329 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24b9862cdff427aedf9ac91a6a5785 SystemUUID:ec24b986-2cdf-f427-aedf-9ac91a6a5785 BootID:dd30c240-edf4-48c0-8c85-26e8eaa6af36 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6d:20:79:6e:b5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6d:20:79:6e:b5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:72:9b:49:bc:f4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 
Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 23:25:59.840415 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.840413 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 23:25:59.840557 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.840545 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 23:25:59.841558 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.841531 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 23:25:59.841712 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.841559 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-131-43.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 23:25:59.841759 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.841722 2564 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 23:25:59.841759 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.841730 2564 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 23:25:59.841759 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.841743 2564 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:25:59.842732 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.842721 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:25:59.843565 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.843556 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:25:59.843676 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.843668 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 23:25:59.846405 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.846393 2564 kubelet.go:491] "Attempting to sync node with API server" Apr 16 23:25:59.846453 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.846414 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 23:25:59.846453 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.846428 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 23:25:59.846453 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.846439 2564 kubelet.go:397] "Adding apiserver pod source" Apr 16 23:25:59.846453 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.846452 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 23:25:59.847534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.847518 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 23:25:59.847625 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.847540 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 23:25:59.850896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.850861 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 23:25:59.850896 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:25:59.850890 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nr9bq" Apr 16 23:25:59.852426 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.852410 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 23:25:59.854262 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854250 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854267 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854274 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854280 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854286 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854292 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854297 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854303 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854310 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 23:25:59.854313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854317 2564 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 16 23:25:59.854539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854325 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 23:25:59.854539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.854335 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 23:25:59.855275 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.855264 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 23:25:59.855308 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.855279 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 23:25:59.857888 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.857873 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nr9bq" Apr 16 23:25:59.858029 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.858001 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 23:25:59.858029 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.858011 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-43.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 23:25:59.858103 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.858037 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-43.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Apr 16 23:25:59.859103 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.859090 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 23:25:59.859156 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.859127 2564 server.go:1295] "Started kubelet" Apr 16 23:25:59.859245 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.859219 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 23:25:59.859245 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.859214 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 23:25:59.859330 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.859272 2564 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 23:25:59.860044 ip-10-0-131-43 systemd[1]: Started Kubernetes Kubelet. Apr 16 23:25:59.862489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.862473 2564 server.go:317] "Adding debug handlers to kubelet server" Apr 16 23:25:59.866376 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.866348 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 23:25:59.870117 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.870100 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 23:25:59.870782 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.870757 2564 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 23:25:59.870889 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.870821 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 23:25:59.871462 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871446 2564 factory.go:55] Registering systemd factory Apr 16 23:25:59.871462 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871464 2564 factory.go:223] Registration of the systemd container factory successfully Apr 16 23:25:59.871640 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871625 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 23:25:59.871716 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871628 2564 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 23:25:59.871716 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871660 2564 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 23:25:59.871716 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871677 2564 factory.go:153] Registering CRI-O factory Apr 16 23:25:59.871716 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871712 2564 factory.go:223] Registration of the crio container factory successfully Apr 16 23:25:59.871896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871741 2564 reconstruct.go:97] "Volume reconstruction finished" Apr 16 23:25:59.871896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871750 2564 reconciler.go:26] "Reconciler: start to sync state" Apr 16 23:25:59.871896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871767 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 23:25:59.871896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871795 2564 
factory.go:103] Registering Raw factory Apr 16 23:25:59.871896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.871813 2564 manager.go:1196] Started watching for new ooms in manager Apr 16 23:25:59.872131 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.871926 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:25:59.872264 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.872249 2564 manager.go:319] Starting recovery of all containers Apr 16 23:25:59.876119 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.876088 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:25:59.879598 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.879576 2564 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-43.ec2.internal\" not found" node="ip-10-0-131-43.ec2.internal" Apr 16 23:25:59.882303 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.882284 2564 manager.go:324] Recovery completed Apr 16 23:25:59.883954 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.883882 2564 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 23:25:59.887270 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.887257 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:25:59.889989 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.889974 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:25:59.890053 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.890002 2564 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-131-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:25:59.890053 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.890013 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:25:59.890504 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.890483 2564 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 23:25:59.890560 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.890504 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 23:25:59.890560 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.890522 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:25:59.892952 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.892939 2564 policy_none.go:49] "None policy: Start" Apr 16 23:25:59.892991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.892956 2564 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 23:25:59.892991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.892965 2564 state_mem.go:35] "Initializing new in-memory state store" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.937298 2564 manager.go:341] "Starting Device Plugin manager" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.937329 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.937339 2564 server.go:85] "Starting device plugin registration server" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.937826 2564 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.937844 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 23:25:59.942811 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:25:59.938055 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.938132 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.938141 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.939215 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 23:25:59.942811 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.939268 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:25:59.976201 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.976155 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 23:25:59.977382 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.977367 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 23:25:59.977460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.977396 2564 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 23:25:59.977460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.977419 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 23:25:59.977460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.977425 2564 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 23:25:59.977603 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:25:59.977528 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 23:25:59.979985 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:25:59.979966 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:00.038586 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.038491 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:00.039469 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.039449 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:00.039585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.039490 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:00.039585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.039508 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:00.039585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.039539 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.047783 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.047759 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.047937 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.047795 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-43.ec2.internal\": node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.060777 
ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.060745 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.078606 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.078580 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal"] Apr 16 23:26:00.078681 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.078654 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:00.079656 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.079637 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:00.079753 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.079674 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:00.079753 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.079688 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:00.081159 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081146 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:00.081304 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081288 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.081350 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081318 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:00.081875 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081861 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:00.081938 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081863 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:00.081938 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081922 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:00.081938 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081937 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:00.082041 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081892 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:00.082041 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.081973 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:00.083155 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.083139 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.083225 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.083163 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:00.083782 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.083767 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:00.083849 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.083795 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:00.083849 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.083806 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:00.097237 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.097217 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-43.ec2.internal\" not found" node="ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.101095 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.101080 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-43.ec2.internal\" not found" node="ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.160887 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.160850 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.173111 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.173090 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/24f90fad8c4786333edb9df978f6d7c6-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal\" (UID: \"24f90fad8c4786333edb9df978f6d7c6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.173189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.173118 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c6c197c2824d262900a926e3ff6a96c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-43.ec2.internal\" (UID: \"8c6c197c2824d262900a926e3ff6a96c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.173189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.173143 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/24f90fad8c4786333edb9df978f6d7c6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal\" (UID: \"24f90fad8c4786333edb9df978f6d7c6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.261605 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.261574 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.273972 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.273944 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/24f90fad8c4786333edb9df978f6d7c6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal\" (UID: \"24f90fad8c4786333edb9df978f6d7c6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.274071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.273981 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/24f90fad8c4786333edb9df978f6d7c6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal\" (UID: \"24f90fad8c4786333edb9df978f6d7c6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.274071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.274006 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c6c197c2824d262900a926e3ff6a96c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-43.ec2.internal\" (UID: \"8c6c197c2824d262900a926e3ff6a96c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.274071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.274056 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/24f90fad8c4786333edb9df978f6d7c6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal\" (UID: \"24f90fad8c4786333edb9df978f6d7c6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.274172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.274072 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c6c197c2824d262900a926e3ff6a96c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-43.ec2.internal\" (UID: \"8c6c197c2824d262900a926e3ff6a96c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.274172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.274055 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/24f90fad8c4786333edb9df978f6d7c6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal\" (UID: \"24f90fad8c4786333edb9df978f6d7c6\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.362449 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.362383 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.398949 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.398917 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.403770 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.403749 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal" Apr 16 23:26:00.463527 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.463492 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.564045 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.564013 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.664627 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.664548 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.765075 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.765044 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-43.ec2.internal\" not found" Apr 16 23:26:00.771234 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.771219 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 23:26:00.771394 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.771374 2564 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 23:26:00.771477 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.771390 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 23:26:00.789009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.788982 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:26:00.846756 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.846717 2564 apiserver.go:52] "Watching apiserver"
Apr 16 23:26:00.858479 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.858455 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 23:26:00.859558 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.859533 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b","openshift-dns/node-resolver-4kt5z","openshift-image-registry/node-ca-dg2kq","openshift-multus/multus-additional-cni-plugins-wnp9b","openshift-multus/multus-mwc9h","openshift-network-operator/iptables-alerter-wm22k","openshift-cluster-node-tuning-operator/tuned-qh8gv","openshift-multus/network-metrics-daemon-zp26z","openshift-network-diagnostics/network-check-target-rl6cs","openshift-ovn-kubernetes/ovnkube-node-md28x","kube-system/konnectivity-agent-wmsm7"]
Apr 16 23:26:00.859669 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.859566 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 23:20:59 +0000 UTC" deadline="2027-11-26 22:02:34.67609067 +0000 UTC"
Apr 16 23:26:00.859669 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.859597 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14134h36m33.816496701s"
Apr 16 23:26:00.862220 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.862196 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.863341 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.863323 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4kt5z"
Apr 16 23:26:00.863440 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.863413 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dg2kq"
Apr 16 23:26:00.864330 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.864311 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 23:26:00.864572 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.864552 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 23:26:00.864653 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.864612 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4d7bt\""
Apr 16 23:26:00.864840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.864812 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 23:26:00.864943 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.864868 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.865535 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.865517 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 23:26:00.865631 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.865567 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pkmwl\""
Apr 16 23:26:00.865692 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.865676 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 23:26:00.865840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.865824 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 23:26:00.865840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.865837 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-q2d9s\""
Apr 16 23:26:00.865950 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.865860 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 23:26:00.865950 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.865938 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 23:26:00.867435 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.867073 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 23:26:00.867435 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.867215 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 23:26:00.867725 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.867686 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 23:26:00.867833 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.867756 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wwdv4\""
Apr 16 23:26:00.868349 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.868329 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 23:26:00.868422 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.868354 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 23:26:00.869362 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.869272 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wm22k"
Apr 16 23:26:00.870407 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.870392 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 23:26:00.870629 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.870606 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.870737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.870679 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.871026 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.871009 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal"
Apr 16 23:26:00.871362 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.871332 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7lzd4\""
Apr 16 23:26:00.871362 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.871334 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 23:26:00.871467 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.871405 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:26:00.871876 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.871862 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 23:26:00.872189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.872174 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:00.872271 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.872248 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2"
Apr 16 23:26:00.872764 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.872744 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:26:00.872764 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.872752 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 23:26:00.872910 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.872774 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 23:26:00.872910 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.872789 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rq8fl\""
Apr 16 23:26:00.872910 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.872809 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-q6zqf\""
Apr 16 23:26:00.873471 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.873455 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:26:00.873536 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.873504 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f"
Apr 16 23:26:00.874781 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.874761 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:00.875919 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.875901 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wmsm7"
Apr 16 23:26:00.876608 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876587 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysctl-d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.876689 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876613 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-run\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.876689 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876627 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 23:26:00.876689 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876632 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-os-release\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.876689 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876660 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-cni-multus\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.876926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876715 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5aea2741-baa5-486d-8a0f-9eef53a7f27a-tmp-dir\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z"
Apr 16 23:26:00.876926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876741 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-socket-dir-parent\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.876926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876759 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 23:26:00.876926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876764 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-conf-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.876926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876829 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysconfig\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.876926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876858 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-host\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq"
Apr 16 23:26:00.876926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876892 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qzn8\" (UniqueName: \"kubernetes.io/projected/5c265860-ea8b-4315-acf6-bbb9ee728fe8-kube-api-access-5qzn8\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876919 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7bc2b26c-3731-4818-b28c-4bb5ce01662b-host-slash\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k"
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876958 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-kubernetes\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876967 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mvt7v\""
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876991 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877015 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877043 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.876993 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877016 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/80d77b0e-6b2a-4741-be2b-8b95c72b915e-cni-binary-copy\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877140 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-systemd\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877170 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cnibin\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.877204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877194 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gdn7\" (UniqueName: \"kubernetes.io/projected/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-kube-api-access-4gdn7\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877225 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-os-release\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877245 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-hostroot\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877276 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-tuned\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877316 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877335 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877362 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877400 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-registration-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877451 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft4l2\" (UniqueName: \"kubernetes.io/projected/331df142-711c-4252-8e63-0342d087918d-kube-api-access-ft4l2\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877488 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-etc-kubernetes\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877514 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqr9d\" (UniqueName: \"kubernetes.io/projected/d31edfcd-461b-4686-a705-ec3883ddb3f0-kube-api-access-rqr9d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877538 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877562 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-host\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877584 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-k8s-cni-cncf-io\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877625 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877634 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7bc2b26c-3731-4818-b28c-4bb5ce01662b-iptables-alerter-script\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877673 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-modprobe-d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.877715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877715 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-lib-modules\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877740 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6pr\" (UniqueName: \"kubernetes.io/projected/5aea2741-baa5-486d-8a0f-9eef53a7f27a-kube-api-access-8r6pr\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877766 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-netns\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877781 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-daemon-config\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877791 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877796 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-socket-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877816 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-system-cni-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877852 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-cni-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877899 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6zb\" (UniqueName: \"kubernetes.io/projected/7bc2b26c-3731-4818-b28c-4bb5ce01662b-kube-api-access-6s6zb\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877926 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-sys\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.877983 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-system-cni-dir\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878006 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rbrlz\""
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878014 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cni-binary-copy\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878036 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-kubelet\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878057 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-multus-certs\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878098 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5aea2741-baa5-486d-8a0f-9eef53a7f27a-hosts-file\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878134 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-sys-fs\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878172 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysctl-conf\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.878366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878197 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwdqj\" (UniqueName: \"kubernetes.io/projected/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-kube-api-access-zwdqj\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878223 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878247 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjh8m\" (UniqueName: \"kubernetes.io/projected/80d77b0e-6b2a-4741-be2b-8b95c72b915e-kube-api-access-mjh8m\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878270 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-device-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878294 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-var-lib-kubelet\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878316 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-serviceca\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878366 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-cni-bin\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878382 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-etc-selinux\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878398 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d31edfcd-461b-4686-a705-ec3883ddb3f0-tmp\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.879009 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.878440 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-cnibin\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.881243 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.881223 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 23:26:00.881431 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.881320 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal"
Apr 16 23:26:00.881431 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.881417 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal"]
Apr 16 23:26:00.883150 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.883129 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 23:26:00.888540
ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.888521 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 23:26:00.888627 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.888548 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal"] Apr 16 23:26:00.898998 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.898964 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vdwx8" Apr 16 23:26:00.908573 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.908551 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vdwx8" Apr 16 23:26:00.972609 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.972437 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 23:26:00.978655 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978632 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.978780 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978670 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5aea2741-baa5-486d-8a0f-9eef53a7f27a-tmp-dir\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z" Apr 16 23:26:00.978780 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978690 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-socket-dir-parent\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.978780 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978744 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-conf-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.978780 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978765 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysconfig\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978832 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-host\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978832 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-socket-dir-parent\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978837 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysconfig\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978859 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qzn8\" (UniqueName: \"kubernetes.io/projected/5c265860-ea8b-4315-acf6-bbb9ee728fe8-kube-api-access-5qzn8\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978845 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-conf-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978889 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7bc2b26c-3731-4818-b28c-4bb5ce01662b-host-slash\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978901 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-host\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978919 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/7bc2b26c-3731-4818-b28c-4bb5ce01662b-host-slash\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978935 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-node-log\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978969 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovnkube-config\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.978991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.978993 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovn-node-metrics-cert\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979002 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5aea2741-baa5-486d-8a0f-9eef53a7f27a-tmp-dir\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979020 2564 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-kubernetes\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979036 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979061 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/80d77b0e-6b2a-4741-be2b-8b95c72b915e-cni-binary-copy\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979096 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-env-overrides\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979126 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv2m8\" (UniqueName: \"kubernetes.io/projected/4457788c-bfe7-45d0-9674-8966cbeef7a6-kube-api-access-jv2m8\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:26:00.979157 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-systemd\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979098 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-kubernetes\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979183 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cnibin\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979209 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovnkube-script-lib\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979239 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c7fc9336-0583-4750-b4d1-143cb0e8e3bf-konnectivity-ca\") pod \"konnectivity-agent-wmsm7\" (UID: \"c7fc9336-0583-4750-b4d1-143cb0e8e3bf\") " pod="kube-system/konnectivity-agent-wmsm7" Apr 16 
23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979258 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-systemd\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979267 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gdn7\" (UniqueName: \"kubernetes.io/projected/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-kube-api-access-4gdn7\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979296 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-os-release\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979319 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-hostroot\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.979478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979320 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cnibin\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b" Apr 16 23:26:00.979478 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:26:00.979376 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-hostroot\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979488 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-os-release\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979519 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-systemd-units\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979542 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-run-netns\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979557 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-var-lib-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.980290 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:26:00.979579 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-tuned\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979600 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979619 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979628 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979651 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/80d77b0e-6b2a-4741-be2b-8b95c72b915e-cni-binary-copy\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " 
pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979635 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-etc-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979718 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-cni-bin\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.979736 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979746 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979777 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-registration-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:00.979792 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:01.47976917 +0000 UTC m=+2.095800651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:00.980290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979825 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-registration-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979850 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft4l2\" (UniqueName: \"kubernetes.io/projected/331df142-711c-4252-8e63-0342d087918d-kube-api-access-ft4l2\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979882 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-etc-kubernetes\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " 
pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979909 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqr9d\" (UniqueName: \"kubernetes.io/projected/d31edfcd-461b-4686-a705-ec3883ddb3f0-kube-api-access-rqr9d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979900 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979936 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979961 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-host\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.979959 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-etc-kubernetes\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980001 
2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-k8s-cni-cncf-io\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980031 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-host\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980037 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7bc2b26c-3731-4818-b28c-4bb5ce01662b-iptables-alerter-script\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980084 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-modprobe-d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980095 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-k8s-cni-cncf-io\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:26:00.980117 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-lib-modules\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980124 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980142 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6pr\" (UniqueName: \"kubernetes.io/projected/5aea2741-baa5-486d-8a0f-9eef53a7f27a-kube-api-access-8r6pr\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980172 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-netns\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980193 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b" Apr 16 
23:26:00.981743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980197 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-modprobe-d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980221 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-daemon-config\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980218 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-lib-modules\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980271 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980267 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-netns\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h" 
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980299 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980337 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-socket-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980369 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-system-cni-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980398 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-cni-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980417 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6zb\" (UniqueName: \"kubernetes.io/projected/7bc2b26c-3731-4818-b28c-4bb5ce01662b-kube-api-access-6s6zb\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980435 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-slash\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980460 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-socket-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980462 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-systemd\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980493 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-cni-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980506 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-log-socket\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980532 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-sys\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980542 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-system-cni-dir\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.982613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980557 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-system-cni-dir\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980587 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cni-binary-copy\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980600 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-sys\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980611 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-kubelet\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980632 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7bc2b26c-3731-4818-b28c-4bb5ce01662b-iptables-alerter-script\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980643 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-multus-certs\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980637 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-system-cni-dir\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980667 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5aea2741-baa5-486d-8a0f-9eef53a7f27a-hosts-file\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980667 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-kubelet\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980716 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-run-multus-certs\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980737 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5aea2741-baa5-486d-8a0f-9eef53a7f27a-hosts-file\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980771 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-sys-fs\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980777 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/80d77b0e-6b2a-4741-be2b-8b95c72b915e-multus-daemon-config\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980801 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysctl-conf\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980828 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwdqj\" (UniqueName: \"kubernetes.io/projected/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-kube-api-access-zwdqj\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980851 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980871 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjh8m\" (UniqueName: \"kubernetes.io/projected/80d77b0e-6b2a-4741-be2b-8b95c72b915e-kube-api-access-mjh8m\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.983361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980836 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-sys-fs\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980898 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-cni-netd\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980925 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-device-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980960 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysctl-conf\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.980973 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-var-lib-kubelet\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981000 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981002 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-kubelet\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981040 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c7fc9336-0583-4750-b4d1-143cb0e8e3bf-agent-certs\") pod \"konnectivity-agent-wmsm7\" (UID: \"c7fc9336-0583-4750-b4d1-143cb0e8e3bf\") " pod="kube-system/konnectivity-agent-wmsm7"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981047 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-device-dir\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981062 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-var-lib-kubelet\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981067 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5c265860-ea8b-4315-acf6-bbb9ee728fe8-cni-binary-copy\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981064 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-serviceca\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981111 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-cni-bin\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981128 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-ovn\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981144 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-etc-selinux\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981165 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d31edfcd-461b-4686-a705-ec3883ddb3f0-tmp\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981179 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-cnibin\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.983866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981192 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-cni-bin\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981200 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysctl-d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981223 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-run\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981244 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/331df142-711c-4252-8e63-0342d087918d-etc-selinux\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981248 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-os-release\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981248 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-cnibin\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981282 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-cni-multus\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981310 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-run\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981338 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c265860-ea8b-4315-acf6-bbb9ee728fe8-os-release\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981341 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-sysctl-d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981352 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/80d77b0e-6b2a-4741-be2b-8b95c72b915e-host-var-lib-cni-multus\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.981406 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-serviceca\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.982669 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d31edfcd-461b-4686-a705-ec3883ddb3f0-etc-tuned\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.984489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.983350 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d31edfcd-461b-4686-a705-ec3883ddb3f0-tmp\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.986050 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.986026 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qzn8\" (UniqueName: \"kubernetes.io/projected/5c265860-ea8b-4315-acf6-bbb9ee728fe8-kube-api-access-5qzn8\") pod \"multus-additional-cni-plugins-wnp9b\" (UID: \"5c265860-ea8b-4315-acf6-bbb9ee728fe8\") " pod="openshift-multus/multus-additional-cni-plugins-wnp9b"
Apr 16 23:26:00.986710 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.986680 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gdn7\" (UniqueName: \"kubernetes.io/projected/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-kube-api-access-4gdn7\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:00.988321 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:00.988293 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c6c197c2824d262900a926e3ff6a96c.slice/crio-8cec59c17b79deae7b2f3124739bc08e6a961b0e172c1dd8c30dda4324e10ab8 WatchSource:0}: Error finding container 8cec59c17b79deae7b2f3124739bc08e6a961b0e172c1dd8c30dda4324e10ab8: Status 404 returned error can't find the container with id 8cec59c17b79deae7b2f3124739bc08e6a961b0e172c1dd8c30dda4324e10ab8
Apr 16 23:26:00.988957 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:00.988932 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24f90fad8c4786333edb9df978f6d7c6.slice/crio-87b4d3eb9329c28b570e7e1c29ab4d2ae42f1048ee3f0ea8ea532836df7ddb7e WatchSource:0}: Error finding container 87b4d3eb9329c28b570e7e1c29ab4d2ae42f1048ee3f0ea8ea532836df7ddb7e: Status 404 returned error can't find the container with id 87b4d3eb9329c28b570e7e1c29ab4d2ae42f1048ee3f0ea8ea532836df7ddb7e
Apr 16 23:26:00.993359 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.993286 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft4l2\" (UniqueName: \"kubernetes.io/projected/331df142-711c-4252-8e63-0342d087918d-kube-api-access-ft4l2\") pod \"aws-ebs-csi-driver-node-bkl7b\" (UID: \"331df142-711c-4252-8e63-0342d087918d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b"
Apr 16 23:26:00.993880 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.993860 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6zb\" (UniqueName: \"kubernetes.io/projected/7bc2b26c-3731-4818-b28c-4bb5ce01662b-kube-api-access-6s6zb\") pod \"iptables-alerter-wm22k\" (UID: \"7bc2b26c-3731-4818-b28c-4bb5ce01662b\") " pod="openshift-network-operator/iptables-alerter-wm22k"
Apr 16 23:26:00.993976 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.993862 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6pr\" (UniqueName: \"kubernetes.io/projected/5aea2741-baa5-486d-8a0f-9eef53a7f27a-kube-api-access-8r6pr\") pod \"node-resolver-4kt5z\" (UID: \"5aea2741-baa5-486d-8a0f-9eef53a7f27a\") " pod="openshift-dns/node-resolver-4kt5z"
Apr 16 23:26:00.994232 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.994209 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqr9d\" (UniqueName: \"kubernetes.io/projected/d31edfcd-461b-4686-a705-ec3883ddb3f0-kube-api-access-rqr9d\") pod \"tuned-qh8gv\" (UID: \"d31edfcd-461b-4686-a705-ec3883ddb3f0\") " pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:00.994298 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.994221 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjh8m\" (UniqueName: \"kubernetes.io/projected/80d77b0e-6b2a-4741-be2b-8b95c72b915e-kube-api-access-mjh8m\") pod \"multus-mwc9h\" (UID: \"80d77b0e-6b2a-4741-be2b-8b95c72b915e\") " pod="openshift-multus/multus-mwc9h"
Apr 16 23:26:00.994965 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.994945 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwdqj\" (UniqueName: \"kubernetes.io/projected/fa4aee53-dd23-4d2a-9b51-9a0d0822c22a-kube-api-access-zwdqj\") pod \"node-ca-dg2kq\" (UID: \"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a\") " pod="openshift-image-registry/node-ca-dg2kq"
Apr 16 23:26:00.995194 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:00.995179 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 23:26:01.000118 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.000099 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qh8gv"
Apr 16 23:26:01.007115 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.007095 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd31edfcd_461b_4686_a705_ec3883ddb3f0.slice/crio-42b9e4eac355473b431dc0baa94e91c51d886f8bd597484c5f9495ec26fe2b92 WatchSource:0}: Error finding container 42b9e4eac355473b431dc0baa94e91c51d886f8bd597484c5f9495ec26fe2b92: Status 404 returned error can't find the container with id 42b9e4eac355473b431dc0baa94e91c51d886f8bd597484c5f9495ec26fe2b92
Apr 16 23:26:01.028265 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.028240 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:26:01.082307 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082271 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:26:01.082307 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082307 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082331 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-slash\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082358 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-systemd\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082390 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082412 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-slash\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082399 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-systemd\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082454 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-log-socket\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082485 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-cni-netd\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082501 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-kubelet\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082517 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c7fc9336-0583-4750-b4d1-143cb0e8e3bf-agent-certs\") pod \"konnectivity-agent-wmsm7\" (UID: \"c7fc9336-0583-4750-b4d1-143cb0e8e3bf\") " pod="kube-system/konnectivity-agent-wmsm7"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082522 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-log-socket\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082534 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-ovn\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082549 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-kubelet\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082562 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082580 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-ovn\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082593 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-node-log\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082596 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-cni-netd\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082616 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovnkube-config\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082629 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-node-log\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082641 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName:
\"kubernetes.io/secret/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovn-node-metrics-cert\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082625 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-run-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082666 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-env-overrides\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082693 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv2m8\" (UniqueName: \"kubernetes.io/projected/4457788c-bfe7-45d0-9674-8966cbeef7a6-kube-api-access-jv2m8\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082739 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovnkube-script-lib\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082762 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c7fc9336-0583-4750-b4d1-143cb0e8e3bf-konnectivity-ca\") pod \"konnectivity-agent-wmsm7\" (UID: \"c7fc9336-0583-4750-b4d1-143cb0e8e3bf\") " pod="kube-system/konnectivity-agent-wmsm7" Apr 16 23:26:01.082796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082788 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-systemd-units\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082814 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-run-netns\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082840 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-var-lib-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082880 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-etc-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082903 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-cni-bin\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082930 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.082979 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-run-netns\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.083036 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-var-lib-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.083052 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-systemd-units\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:26:01.083084 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-etc-openvswitch\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.083116 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-cni-bin\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.083124 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457788c-bfe7-45d0-9674-8966cbeef7a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.083237 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovnkube-config\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.083256 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-env-overrides\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.083539 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:26:01.083430 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c7fc9336-0583-4750-b4d1-143cb0e8e3bf-konnectivity-ca\") pod \"konnectivity-agent-wmsm7\" (UID: \"c7fc9336-0583-4750-b4d1-143cb0e8e3bf\") " pod="kube-system/konnectivity-agent-wmsm7" Apr 16 23:26:01.083539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.083466 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovnkube-script-lib\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.085469 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.085449 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4457788c-bfe7-45d0-9674-8966cbeef7a6-ovn-node-metrics-cert\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.085585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.085570 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c7fc9336-0583-4750-b4d1-143cb0e8e3bf-agent-certs\") pod \"konnectivity-agent-wmsm7\" (UID: \"c7fc9336-0583-4750-b4d1-143cb0e8e3bf\") " pod="kube-system/konnectivity-agent-wmsm7" Apr 16 23:26:01.087784 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.087767 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:01.087852 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.087790 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:01.087852 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.087804 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2frd4 for pod openshift-network-diagnostics/network-check-target-rl6cs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:01.087937 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.087872 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4 podName:a35b971f-784e-46e9-b251-dbb9a720c52f nodeName:}" failed. No retries permitted until 2026-04-16 23:26:01.587854765 +0000 UTC m=+2.203886236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2frd4" (UniqueName: "kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4") pod "network-check-target-rl6cs" (UID: "a35b971f-784e-46e9-b251-dbb9a720c52f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:01.089611 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.089595 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv2m8\" (UniqueName: \"kubernetes.io/projected/4457788c-bfe7-45d0-9674-8966cbeef7a6-kube-api-access-jv2m8\") pod \"ovnkube-node-md28x\" (UID: \"4457788c-bfe7-45d0-9674-8966cbeef7a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.199185 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.199077 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" Apr 16 23:26:01.206657 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.206625 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331df142_711c_4252_8e63_0342d087918d.slice/crio-16341d695d498d1d929288753597446b4eeae9bf15dfc222b41c61951dd91b49 WatchSource:0}: Error finding container 16341d695d498d1d929288753597446b4eeae9bf15dfc222b41c61951dd91b49: Status 404 returned error can't find the container with id 16341d695d498d1d929288753597446b4eeae9bf15dfc222b41c61951dd91b49 Apr 16 23:26:01.218282 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.218259 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4kt5z" Apr 16 23:26:01.220831 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.220812 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dg2kq" Apr 16 23:26:01.225871 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.225839 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aea2741_baa5_486d_8a0f_9eef53a7f27a.slice/crio-4ad004a1660ed300d7e1f4f7bec738f7878619122718f0365899b213edf0f0b6 WatchSource:0}: Error finding container 4ad004a1660ed300d7e1f4f7bec738f7878619122718f0365899b213edf0f0b6: Status 404 returned error can't find the container with id 4ad004a1660ed300d7e1f4f7bec738f7878619122718f0365899b213edf0f0b6 Apr 16 23:26:01.229589 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.229566 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa4aee53_dd23_4d2a_9b51_9a0d0822c22a.slice/crio-33b685e696a43518a56cbb3f2edd5f81cd5dd9cb85a062e232a7a3dac2f36c17 WatchSource:0}: Error finding container 
33b685e696a43518a56cbb3f2edd5f81cd5dd9cb85a062e232a7a3dac2f36c17: Status 404 returned error can't find the container with id 33b685e696a43518a56cbb3f2edd5f81cd5dd9cb85a062e232a7a3dac2f36c17 Apr 16 23:26:01.237504 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.237484 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" Apr 16 23:26:01.243083 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.243060 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c265860_ea8b_4315_acf6_bbb9ee728fe8.slice/crio-6fec0d910f4c9e7f6a184b5fd3bdeed36c65159bf8c418822f0ce8b9c227e3b8 WatchSource:0}: Error finding container 6fec0d910f4c9e7f6a184b5fd3bdeed36c65159bf8c418822f0ce8b9c227e3b8: Status 404 returned error can't find the container with id 6fec0d910f4c9e7f6a184b5fd3bdeed36c65159bf8c418822f0ce8b9c227e3b8 Apr 16 23:26:01.251287 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.251264 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wm22k" Apr 16 23:26:01.257926 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.257901 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bc2b26c_3731_4818_b28c_4bb5ce01662b.slice/crio-c7835e22de8a61ce854fb891324d8a605d2eb92e0df5b2ada53af8d90dfe7109 WatchSource:0}: Error finding container c7835e22de8a61ce854fb891324d8a605d2eb92e0df5b2ada53af8d90dfe7109: Status 404 returned error can't find the container with id c7835e22de8a61ce854fb891324d8a605d2eb92e0df5b2ada53af8d90dfe7109 Apr 16 23:26:01.269687 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.269664 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mwc9h" Apr 16 23:26:01.275901 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.275876 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80d77b0e_6b2a_4741_be2b_8b95c72b915e.slice/crio-6adf9de440cc8fc4679b29d68ed9a882052f9a81d15238fc1d7311b316ff5713 WatchSource:0}: Error finding container 6adf9de440cc8fc4679b29d68ed9a882052f9a81d15238fc1d7311b316ff5713: Status 404 returned error can't find the container with id 6adf9de440cc8fc4679b29d68ed9a882052f9a81d15238fc1d7311b316ff5713 Apr 16 23:26:01.307166 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.307133 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:26:01.311778 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.311758 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wmsm7" Apr 16 23:26:01.313363 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.313338 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4457788c_bfe7_45d0_9674_8966cbeef7a6.slice/crio-08835888108b95e132c2418ec6663d9dbc658cdc1fe1247b3d2dab9f6d91bf67 WatchSource:0}: Error finding container 08835888108b95e132c2418ec6663d9dbc658cdc1fe1247b3d2dab9f6d91bf67: Status 404 returned error can't find the container with id 08835888108b95e132c2418ec6663d9dbc658cdc1fe1247b3d2dab9f6d91bf67 Apr 16 23:26:01.319026 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:01.319004 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7fc9336_0583_4750_b4d1_143cb0e8e3bf.slice/crio-c24a8c4f8b690159da13198f8ff51090bd36c2b976b31f09b2846106097de3ef WatchSource:0}: Error finding container 
c24a8c4f8b690159da13198f8ff51090bd36c2b976b31f09b2846106097de3ef: Status 404 returned error can't find the container with id c24a8c4f8b690159da13198f8ff51090bd36c2b976b31f09b2846106097de3ef Apr 16 23:26:01.486786 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.486753 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:01.486954 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.486914 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:01.487032 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.486975 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:02.486957094 +0000 UTC m=+3.102988577 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:01.588068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.588023 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:01.588228 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.588178 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:01.588228 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.588203 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:01.588228 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.588217 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2frd4 for pod openshift-network-diagnostics/network-check-target-rl6cs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:01.588496 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.588279 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4 podName:a35b971f-784e-46e9-b251-dbb9a720c52f nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:02.588260857 +0000 UTC m=+3.204292328 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2frd4" (UniqueName: "kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4") pod "network-check-target-rl6cs" (UID: "a35b971f-784e-46e9-b251-dbb9a720c52f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:01.698930 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.698901 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:01.909942 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.909784 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:21:00 +0000 UTC" deadline="2027-10-14 21:09:23.151448516 +0000 UTC" Apr 16 23:26:01.909942 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.909824 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13101h43m21.241628633s" Apr 16 23:26:01.944241 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.943993 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:01.980944 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.980184 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:01.980944 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.980322 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:01.980944 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.980777 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:01.980944 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:01.980865 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:01.988687 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.988634 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"08835888108b95e132c2418ec6663d9dbc658cdc1fe1247b3d2dab9f6d91bf67"} Apr 16 23:26:01.997715 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:01.997669 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dg2kq" event={"ID":"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a","Type":"ContainerStarted","Data":"33b685e696a43518a56cbb3f2edd5f81cd5dd9cb85a062e232a7a3dac2f36c17"} Apr 16 23:26:02.003088 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.002984 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4kt5z" event={"ID":"5aea2741-baa5-486d-8a0f-9eef53a7f27a","Type":"ContainerStarted","Data":"4ad004a1660ed300d7e1f4f7bec738f7878619122718f0365899b213edf0f0b6"} Apr 16 23:26:02.012908 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.012826 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" event={"ID":"24f90fad8c4786333edb9df978f6d7c6","Type":"ContainerStarted","Data":"87b4d3eb9329c28b570e7e1c29ab4d2ae42f1048ee3f0ea8ea532836df7ddb7e"} Apr 16 23:26:02.021413 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.021353 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wmsm7" event={"ID":"c7fc9336-0583-4750-b4d1-143cb0e8e3bf","Type":"ContainerStarted","Data":"c24a8c4f8b690159da13198f8ff51090bd36c2b976b31f09b2846106097de3ef"} Apr 16 23:26:02.033667 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.033611 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mwc9h" event={"ID":"80d77b0e-6b2a-4741-be2b-8b95c72b915e","Type":"ContainerStarted","Data":"6adf9de440cc8fc4679b29d68ed9a882052f9a81d15238fc1d7311b316ff5713"} Apr 16 23:26:02.041952 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.041873 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wm22k" event={"ID":"7bc2b26c-3731-4818-b28c-4bb5ce01662b","Type":"ContainerStarted","Data":"c7835e22de8a61ce854fb891324d8a605d2eb92e0df5b2ada53af8d90dfe7109"} Apr 16 23:26:02.060598 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.060518 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" event={"ID":"5c265860-ea8b-4315-acf6-bbb9ee728fe8","Type":"ContainerStarted","Data":"6fec0d910f4c9e7f6a184b5fd3bdeed36c65159bf8c418822f0ce8b9c227e3b8"} Apr 16 23:26:02.081638 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.081598 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" event={"ID":"331df142-711c-4252-8e63-0342d087918d","Type":"ContainerStarted","Data":"16341d695d498d1d929288753597446b4eeae9bf15dfc222b41c61951dd91b49"} Apr 16 23:26:02.091270 ip-10-0-131-43 kubenswrapper[2564]: 
I0416 23:26:02.091227 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" event={"ID":"d31edfcd-461b-4686-a705-ec3883ddb3f0","Type":"ContainerStarted","Data":"42b9e4eac355473b431dc0baa94e91c51d886f8bd597484c5f9495ec26fe2b92"} Apr 16 23:26:02.109046 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.109003 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal" event={"ID":"8c6c197c2824d262900a926e3ff6a96c","Type":"ContainerStarted","Data":"8cec59c17b79deae7b2f3124739bc08e6a961b0e172c1dd8c30dda4324e10ab8"} Apr 16 23:26:02.492646 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.492602 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:02.492851 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:02.492812 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:02.492909 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:02.492881 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:04.492859959 +0000 UTC m=+5.108891419 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:02.593359 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.593313 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:02.593541 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:02.593527 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:02.593599 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:02.593548 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:02.593599 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:02.593561 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2frd4 for pod openshift-network-diagnostics/network-check-target-rl6cs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:02.593692 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:02.593627 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4 podName:a35b971f-784e-46e9-b251-dbb9a720c52f nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:04.593608338 +0000 UTC m=+5.209639810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2frd4" (UniqueName: "kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4") pod "network-check-target-rl6cs" (UID: "a35b971f-784e-46e9-b251-dbb9a720c52f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:02.910418 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.910293 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:21:00 +0000 UTC" deadline="2027-09-16 08:52:15.615929736 +0000 UTC" Apr 16 23:26:02.910418 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:02.910331 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12417h26m12.705602095s" Apr 16 23:26:03.980530 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:03.980497 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:03.980997 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:03.980634 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:03.981056 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:03.981039 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:03.981143 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:03.981121 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:04.511893 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:04.511851 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:04.512078 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:04.512001 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:04.512078 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:04.512075 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:08.512046415 +0000 UTC m=+9.128077877 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:04.612644 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:04.612459 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:04.612850 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:04.612665 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:04.612850 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:04.612687 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:04.612850 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:04.612714 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2frd4 for pod openshift-network-diagnostics/network-check-target-rl6cs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:04.612850 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:04.612778 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4 podName:a35b971f-784e-46e9-b251-dbb9a720c52f nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:08.612758919 +0000 UTC m=+9.228790389 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2frd4" (UniqueName: "kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4") pod "network-check-target-rl6cs" (UID: "a35b971f-784e-46e9-b251-dbb9a720c52f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:05.978761 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:05.978728 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:05.979218 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:05.978879 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:05.979490 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:05.979277 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:05.979490 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:05.979391 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:07.978206 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:07.978168 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:07.978206 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:07.978196 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:07.978763 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:07.978325 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:07.978763 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:07.978430 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:08.545406 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:08.545373 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:08.545722 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:08.545543 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:08.545722 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:08.545616 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:16.545598058 +0000 UTC m=+17.161629515 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:08.646396 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:08.646339 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:08.646589 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:08.646490 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:08.646589 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:08.646509 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:08.646589 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:08.646522 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2frd4 for pod openshift-network-diagnostics/network-check-target-rl6cs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:08.646589 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:08.646588 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4 podName:a35b971f-784e-46e9-b251-dbb9a720c52f nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:16.646569165 +0000 UTC m=+17.262600641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2frd4" (UniqueName: "kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4") pod "network-check-target-rl6cs" (UID: "a35b971f-784e-46e9-b251-dbb9a720c52f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:09.979434 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:09.979393 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:09.979905 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:09.979539 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:09.979905 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:09.979744 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:09.979905 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:09.979851 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:11.980678 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:11.980648 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:11.981099 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:11.980654 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:11.981099 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:11.980778 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:11.981099 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:11.980862 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:13.980533 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:13.980497 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:13.981001 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:13.980502 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:13.981001 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:13.980612 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:13.981001 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:13.980753 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:15.978070 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:15.978032 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:15.978070 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:15.978064 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:15.978577 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:15.978209 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:15.978577 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:15.978356 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:16.604966 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:16.604925 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:16.605165 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:16.605037 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:16.605165 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:16.605122 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:32.605101207 +0000 UTC m=+33.221132676 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:16.706070 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:16.706033 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:16.706250 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:16.706171 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:16.706250 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:16.706192 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:16.706250 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:16.706207 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2frd4 for pod openshift-network-diagnostics/network-check-target-rl6cs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:16.706383 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:16.706272 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4 podName:a35b971f-784e-46e9-b251-dbb9a720c52f nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:32.706254874 +0000 UTC m=+33.322286345 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2frd4" (UniqueName: "kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4") pod "network-check-target-rl6cs" (UID: "a35b971f-784e-46e9-b251-dbb9a720c52f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:17.977665 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:17.977625 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:17.978218 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:17.977672 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:17.978218 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:17.977790 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:17.978218 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:17.977878 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:19.979306 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:19.979079 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:19.980007 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:19.979145 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:19.980007 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:19.979414 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:26:19.980007 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:19.979559 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f" Apr 16 23:26:20.156291 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.156254 2564 generic.go:358] "Generic (PLEG): container finished" podID="5c265860-ea8b-4315-acf6-bbb9ee728fe8" containerID="b01bb1c76f7716373bbe690b0645334e6c49137c69716bf89b8fbfb0f23c02f0" exitCode=0 Apr 16 23:26:20.156461 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.156328 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" event={"ID":"5c265860-ea8b-4315-acf6-bbb9ee728fe8","Type":"ContainerDied","Data":"b01bb1c76f7716373bbe690b0645334e6c49137c69716bf89b8fbfb0f23c02f0"} Apr 16 23:26:20.157755 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.157672 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" event={"ID":"331df142-711c-4252-8e63-0342d087918d","Type":"ContainerStarted","Data":"f24e58ae8f734ab2b788429746fc66fa84be1c23b306e01408023cb0556b35e1"} Apr 16 23:26:20.158959 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.158931 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" event={"ID":"d31edfcd-461b-4686-a705-ec3883ddb3f0","Type":"ContainerStarted","Data":"7bd1d3502665c21b154797c4047b45747a478f23313a40c0d4ba271ec8240543"} Apr 16 23:26:20.160273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.160249 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal" event={"ID":"8c6c197c2824d262900a926e3ff6a96c","Type":"ContainerStarted","Data":"e7e51998811a0dbdde742a60e7e5e690181d4921ac12a5cb65802005d1f8a074"} Apr 16 23:26:20.162848 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.162831 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log" Apr 16 23:26:20.163150 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.163134 2564 generic.go:358] "Generic (PLEG): container finished" podID="4457788c-bfe7-45d0-9674-8966cbeef7a6" containerID="e4e7f05a50c1a4489c1e08cb9d6a65a7e522e1632d66efc78b4f52f5194e1d28" exitCode=1 Apr 16 23:26:20.163219 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.163191 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"74807f59561178634d60fa274f016d365aadc79f0d44c37f4ca98274f2a13261"} Apr 16 23:26:20.163219 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.163215 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"3604210db516421a9c8b9f54305a1969b215de8d97bad333745f71197e6d906f"} Apr 16 23:26:20.163301 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.163225 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"25f436f27feb365057945160f5266ed7c9bc70073d6a3229351fb79c7e2aacd2"} Apr 16 23:26:20.163301 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.163234 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"26c740de726db2f11e8d06348e00cae8a09f2f506d1b83c90450a4032b6a20c0"} Apr 16 23:26:20.163301 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.163242 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" 
event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerDied","Data":"e4e7f05a50c1a4489c1e08cb9d6a65a7e522e1632d66efc78b4f52f5194e1d28"}
Apr 16 23:26:20.163301 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.163257 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"4826b976e7ac9ebf32287e646ee90b5d48efa3f98077d9bf82ee19c30a17f30d"}
Apr 16 23:26:20.164342 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.164321 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dg2kq" event={"ID":"fa4aee53-dd23-4d2a-9b51-9a0d0822c22a","Type":"ContainerStarted","Data":"a1295d9b18cb6a9330afcafd307b2024145b5b942650be27853b2cf180e45e23"}
Apr 16 23:26:20.165536 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.165518 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4kt5z" event={"ID":"5aea2741-baa5-486d-8a0f-9eef53a7f27a","Type":"ContainerStarted","Data":"4a46f86dd51e02d9eab6f52c09dbb6c4eb11b6913ba66300a2e4ff7f764b128c"}
Apr 16 23:26:20.166794 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.166772 2564 generic.go:358] "Generic (PLEG): container finished" podID="24f90fad8c4786333edb9df978f6d7c6" containerID="5610554a827eb69caf644b01e8d451c63b134e08f0ea9a73e8618b4eb2712cba" exitCode=0
Apr 16 23:26:20.166873 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.166815 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" event={"ID":"24f90fad8c4786333edb9df978f6d7c6","Type":"ContainerDied","Data":"5610554a827eb69caf644b01e8d451c63b134e08f0ea9a73e8618b4eb2712cba"}
Apr 16 23:26:20.168002 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.167984 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wmsm7" event={"ID":"c7fc9336-0583-4750-b4d1-143cb0e8e3bf","Type":"ContainerStarted","Data":"a6863821868d699ff76aa2af0419e61618c659fcdda860ad4395ecd8d1c1a436"}
Apr 16 23:26:20.169236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.169216 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mwc9h" event={"ID":"80d77b0e-6b2a-4741-be2b-8b95c72b915e","Type":"ContainerStarted","Data":"41e87781b237f2156d09d1ea5120b4d864b76b0784426516a2c0627fc9ca9ff6"}
Apr 16 23:26:20.202470 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.202418 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4kt5z" podStartSLOduration=2.3319937 podStartE2EDuration="20.202400593s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.228087229 +0000 UTC m=+1.844118685" lastFinishedPulling="2026-04-16 23:26:19.098494109 +0000 UTC m=+19.714525578" observedRunningTime="2026-04-16 23:26:20.187871982 +0000 UTC m=+20.803903462" watchObservedRunningTime="2026-04-16 23:26:20.202400593 +0000 UTC m=+20.818432073"
Apr 16 23:26:20.202795 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.202743 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mwc9h" podStartSLOduration=2.370702477 podStartE2EDuration="20.202730244s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.277319903 +0000 UTC m=+1.893351364" lastFinishedPulling="2026-04-16 23:26:19.109347667 +0000 UTC m=+19.725379131" observedRunningTime="2026-04-16 23:26:20.202365981 +0000 UTC m=+20.818397462" watchObservedRunningTime="2026-04-16 23:26:20.202730244 +0000 UTC m=+20.818761725"
Apr 16 23:26:20.213859 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.213762 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-43.ec2.internal" podStartSLOduration=20.213743491 podStartE2EDuration="20.213743491s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:26:20.213374902 +0000 UTC m=+20.829406383" watchObservedRunningTime="2026-04-16 23:26:20.213743491 +0000 UTC m=+20.829774972"
Apr 16 23:26:20.226880 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.226831 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dg2kq" podStartSLOduration=10.529094917 podStartE2EDuration="20.226814319s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.231063436 +0000 UTC m=+1.847094893" lastFinishedPulling="2026-04-16 23:26:10.928782835 +0000 UTC m=+11.544814295" observedRunningTime="2026-04-16 23:26:20.226691627 +0000 UTC m=+20.842723107" watchObservedRunningTime="2026-04-16 23:26:20.226814319 +0000 UTC m=+20.842845798"
Apr 16 23:26:20.241238 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.241177 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qh8gv" podStartSLOduration=2.151350846 podStartE2EDuration="20.241156809s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.009372159 +0000 UTC m=+1.625403616" lastFinishedPulling="2026-04-16 23:26:19.099178115 +0000 UTC m=+19.715209579" observedRunningTime="2026-04-16 23:26:20.24061813 +0000 UTC m=+20.856649645" watchObservedRunningTime="2026-04-16 23:26:20.241156809 +0000 UTC m=+20.857188293"
Apr 16 23:26:20.259317 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.259268 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wmsm7" podStartSLOduration=2.482704043 podStartE2EDuration="20.25925293s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.320847801 +0000 UTC m=+1.936879257" lastFinishedPulling="2026-04-16 23:26:19.097396686 +0000 UTC m=+19.713428144" observedRunningTime="2026-04-16 23:26:20.259210133 +0000 UTC m=+20.875241612" watchObservedRunningTime="2026-04-16 23:26:20.25925293 +0000 UTC m=+20.875284392"
Apr 16 23:26:20.730548 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.730386 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 23:26:20.948661 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.948523 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T23:26:20.730544333Z","UUID":"afa5dd36-f4f0-49e9-81ce-0052f9d55fd0","Handler":null,"Name":"","Endpoint":""}
Apr 16 23:26:20.951281 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.951262 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 23:26:20.951405 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:20.951288 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 23:26:21.173888 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:21.173799 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" event={"ID":"24f90fad8c4786333edb9df978f6d7c6","Type":"ContainerStarted","Data":"1714b4c45390c378f3cf99bbf3dbb22b9c3bcbf80ee2ff0ad8058119c17c7703"}
Apr 16 23:26:21.175988 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:21.175958 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wm22k" event={"ID":"7bc2b26c-3731-4818-b28c-4bb5ce01662b","Type":"ContainerStarted","Data":"3815c7763b9bdde933bdd904855419998e58b68c794d58a22277335f15859cde"}
Apr 16 23:26:21.178837 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:21.178788 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" event={"ID":"331df142-711c-4252-8e63-0342d087918d","Type":"ContainerStarted","Data":"409cf61b444d570fb5306cc55f70f2a43b523b210605ea4fcd8c0097caed5482"}
Apr 16 23:26:21.188116 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:21.188077 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-43.ec2.internal" podStartSLOduration=21.188063513 podStartE2EDuration="21.188063513s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:26:21.18727258 +0000 UTC m=+21.803304050" watchObservedRunningTime="2026-04-16 23:26:21.188063513 +0000 UTC m=+21.804094991"
Apr 16 23:26:21.199534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:21.199487 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wm22k" podStartSLOduration=3.362247422 podStartE2EDuration="21.199471489s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.25956737 +0000 UTC m=+1.875598827" lastFinishedPulling="2026-04-16 23:26:19.096791429 +0000 UTC m=+19.712822894" observedRunningTime="2026-04-16 23:26:21.198885033 +0000 UTC m=+21.814916514" watchObservedRunningTime="2026-04-16 23:26:21.199471489 +0000 UTC m=+21.815502967"
Apr 16 23:26:21.978437 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:21.978367 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:21.978634 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:21.978380 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:26:21.978634 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:21.978494 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2"
Apr 16 23:26:21.978634 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:21.978591 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f"
Apr 16 23:26:22.182694 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:22.182669 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:26:22.183218 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:22.183071 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"523bec3a991b0fd8fce3c1fd0f696ebe0fc1c7c9b4fedcefa0d771b244b29ef6"}
Apr 16 23:26:22.184967 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:22.184943 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" event={"ID":"331df142-711c-4252-8e63-0342d087918d","Type":"ContainerStarted","Data":"f32a763e04e07f80040c4f7da4e3955c7c7615f4ca5ab8e760e5fdb119de35bf"}
Apr 16 23:26:23.408925 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:23.408891 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wmsm7"
Apr 16 23:26:23.410030 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:23.409999 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wmsm7"
Apr 16 23:26:23.423346 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:23.423297 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bkl7b" podStartSLOduration=3.014207434 podStartE2EDuration="23.423277338s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.208235323 +0000 UTC m=+1.824266783" lastFinishedPulling="2026-04-16 23:26:21.617305215 +0000 UTC m=+22.233336687" observedRunningTime="2026-04-16 23:26:22.199313971 +0000 UTC m=+22.815345450" watchObservedRunningTime="2026-04-16 23:26:23.423277338 +0000 UTC m=+24.039308817"
Apr 16 23:26:23.984110 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:23.984074 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:26:23.984285 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:23.984080 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:23.984285 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:23.984206 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f"
Apr 16 23:26:23.984398 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:23.984282 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2"
Apr 16 23:26:24.188426 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:24.188352 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wmsm7"
Apr 16 23:26:24.188911 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:24.188892 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wmsm7"
Apr 16 23:26:25.193139 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:25.192980 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:26:25.193658 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:25.193498 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"c8ca5138ef43346780cc7a3117d8f8bfc7353b751de5d90b2200a2a09a053a8e"}
Apr 16 23:26:25.193968 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:25.193950 2564 scope.go:117] "RemoveContainer" containerID="e4e7f05a50c1a4489c1e08cb9d6a65a7e522e1632d66efc78b4f52f5194e1d28"
Apr 16 23:26:25.981069 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:25.981038 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:26:25.981223 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:25.981038 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:25.981223 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:25.981161 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f"
Apr 16 23:26:25.981348 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:25.981227 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2"
Apr 16 23:26:26.199478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.199393 2564 generic.go:358] "Generic (PLEG): container finished" podID="5c265860-ea8b-4315-acf6-bbb9ee728fe8" containerID="f2fb0ebd8851000803208158376c789d54a508dcb66a3ec37ddfd7ffa623753d" exitCode=0
Apr 16 23:26:26.199907 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.199490 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" event={"ID":"5c265860-ea8b-4315-acf6-bbb9ee728fe8","Type":"ContainerDied","Data":"f2fb0ebd8851000803208158376c789d54a508dcb66a3ec37ddfd7ffa623753d"}
Apr 16 23:26:26.203007 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.202992 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:26:26.203860 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.203831 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" event={"ID":"4457788c-bfe7-45d0-9674-8966cbeef7a6","Type":"ContainerStarted","Data":"edc214e13040c6093a834e8dca394b35f1b4000ceb22b406364fe4e2b7eb66e6"}
Apr 16 23:26:26.205496 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.205474 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:26.205596 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.205518 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:26.205596 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.205533 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:26.222021 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.221994 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:26.222140 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.222062 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-md28x"
Apr 16 23:26:26.241690 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.241645 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" podStartSLOduration=8.396688928 podStartE2EDuration="26.241630048s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.315121933 +0000 UTC m=+1.931153389" lastFinishedPulling="2026-04-16 23:26:19.160063049 +0000 UTC m=+19.776094509" observedRunningTime="2026-04-16 23:26:26.241088968 +0000 UTC m=+26.857120463" watchObservedRunningTime="2026-04-16 23:26:26.241630048 +0000 UTC m=+26.857661563"
Apr 16 23:26:26.754460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.754427 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zp26z"]
Apr 16 23:26:26.754654 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.754578 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:26.754736 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:26.754692 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2"
Apr 16 23:26:26.756884 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.756855 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rl6cs"]
Apr 16 23:26:26.757004 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:26.756970 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:26:26.757089 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:26.757063 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f"
Apr 16 23:26:27.978644 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:27.978610 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:27.979313 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:27.978774 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2"
Apr 16 23:26:28.210001 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:28.209965 2564 generic.go:358] "Generic (PLEG): container finished" podID="5c265860-ea8b-4315-acf6-bbb9ee728fe8" containerID="4290da4610004d762dc39d8941fb45aac474e6522bfe9fb26d2de203d345d07d" exitCode=0
Apr 16 23:26:28.210153 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:28.210028 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" event={"ID":"5c265860-ea8b-4315-acf6-bbb9ee728fe8","Type":"ContainerDied","Data":"4290da4610004d762dc39d8941fb45aac474e6522bfe9fb26d2de203d345d07d"}
Apr 16 23:26:28.978143 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:28.978112 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:26:28.978369 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:28.978240 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f"
Apr 16 23:26:29.978846 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:29.978813 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:29.979455 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:29.978952 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2"
Apr 16 23:26:30.215370 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:30.215339 2564 generic.go:358] "Generic (PLEG): container finished" podID="5c265860-ea8b-4315-acf6-bbb9ee728fe8" containerID="55111f2b2292f2d916cb00c69af999bd41f013d83713d4ba26bab430cf9732aa" exitCode=0
Apr 16 23:26:30.215534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:30.215397 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" event={"ID":"5c265860-ea8b-4315-acf6-bbb9ee728fe8","Type":"ContainerDied","Data":"55111f2b2292f2d916cb00c69af999bd41f013d83713d4ba26bab430cf9732aa"}
Apr 16 23:26:30.978127 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:30.978096 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:26:30.978346 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:30.978195 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rl6cs" podUID="a35b971f-784e-46e9-b251-dbb9a720c52f"
Apr 16 23:26:31.978049 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:31.977840 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:31.978503 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:31.978149 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2"
Apr 16 23:26:32.208350 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.208321 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-43.ec2.internal" event="NodeReady"
Apr 16 23:26:32.208531 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.208485 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 23:26:32.249131 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.249103 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cwjtr"]
Apr 16 23:26:32.268846 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.268813 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fr2wb"]
Apr 16 23:26:32.269029 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.269006 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:26:32.271252 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.271214 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m6886\""
Apr 16 23:26:32.271252 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.271247 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 23:26:32.271687 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.271286 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 23:26:32.271687 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.271227 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 23:26:32.291901 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.291868 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cwjtr"]
Apr 16 23:26:32.292086 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.291923 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fr2wb"]
Apr 16 23:26:32.292086 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.291950 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.294154 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.294130 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 23:26:32.294274 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.294142 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 23:26:32.294274 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.294242 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-89jck\""
Apr 16 23:26:32.319614 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.319569 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.319614 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.319609 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:26:32.319844 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.319717 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-758qs\" (UniqueName: \"kubernetes.io/projected/3dc4e703-91ac-44ae-9d1a-83214f2378fd-kube-api-access-758qs\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:26:32.319844 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.319779 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9efb52c5-c96d-422a-8c15-e03f71fdd622-tmp-dir\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.319844 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.319812 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99k9\" (UniqueName: \"kubernetes.io/projected/9efb52c5-c96d-422a-8c15-e03f71fdd622-kube-api-access-l99k9\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.319982 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.319847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9efb52c5-c96d-422a-8c15-e03f71fdd622-config-volume\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.420575 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.420537 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-758qs\" (UniqueName: \"kubernetes.io/projected/3dc4e703-91ac-44ae-9d1a-83214f2378fd-kube-api-access-758qs\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:26:32.420769 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.420591 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9efb52c5-c96d-422a-8c15-e03f71fdd622-tmp-dir\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.420769 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.420611 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l99k9\" (UniqueName: \"kubernetes.io/projected/9efb52c5-c96d-422a-8c15-e03f71fdd622-kube-api-access-l99k9\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.420769 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.420635 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9efb52c5-c96d-422a-8c15-e03f71fdd622-config-volume\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.420769 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.420669 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.420769 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.420687 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:26:32.421049 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.420843 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:26:32.421049 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.420874 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:26:32.421049 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.420919 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls podName:9efb52c5-c96d-422a-8c15-e03f71fdd622 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:32.920896931 +0000 UTC m=+33.536928409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls") pod "dns-default-fr2wb" (UID: "9efb52c5-c96d-422a-8c15-e03f71fdd622") : secret "dns-default-metrics-tls" not found
Apr 16 23:26:32.421049 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.420931 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9efb52c5-c96d-422a-8c15-e03f71fdd622-tmp-dir\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.421049 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.420939 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert podName:3dc4e703-91ac-44ae-9d1a-83214f2378fd nodeName:}" failed. No retries permitted until 2026-04-16 23:26:32.920929018 +0000 UTC m=+33.536960498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert") pod "ingress-canary-cwjtr" (UID: "3dc4e703-91ac-44ae-9d1a-83214f2378fd") : secret "canary-serving-cert" not found
Apr 16 23:26:32.421259 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.421161 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9efb52c5-c96d-422a-8c15-e03f71fdd622-config-volume\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.435684 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.432743 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99k9\" (UniqueName: \"kubernetes.io/projected/9efb52c5-c96d-422a-8c15-e03f71fdd622-kube-api-access-l99k9\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:26:32.435684 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.433403 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-758qs\" (UniqueName: \"kubernetes.io/projected/3dc4e703-91ac-44ae-9d1a-83214f2378fd-kube-api-access-758qs\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:26:32.622185 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.622090 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:26:32.622325 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.622268 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret:
object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:32.622377 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.622344 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:04.622326848 +0000 UTC m=+65.238358305 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:32.722455 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.722404 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:32.722629 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.722600 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:32.722629 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.722625 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:32.722760 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.722639 2564 projected.go:194] Error preparing data for projected volume kube-api-access-2frd4 for pod openshift-network-diagnostics/network-check-target-rl6cs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:32.722760 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.722724 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4 podName:a35b971f-784e-46e9-b251-dbb9a720c52f nodeName:}" failed. No retries permitted until 2026-04-16 23:27:04.722686043 +0000 UTC m=+65.338717502 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2frd4" (UniqueName: "kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4") pod "network-check-target-rl6cs" (UID: "a35b971f-784e-46e9-b251-dbb9a720c52f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:32.924236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.924143 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb" Apr 16 23:26:32.924236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.924184 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr" Apr 16 23:26:32.924453 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.924304 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:32.924453 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.924375 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls podName:9efb52c5-c96d-422a-8c15-e03f71fdd622 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:33.924356802 +0000 UTC m=+34.540388262 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls") pod "dns-default-fr2wb" (UID: "9efb52c5-c96d-422a-8c15-e03f71fdd622") : secret "dns-default-metrics-tls" not found Apr 16 23:26:32.924453 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.924304 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:32.924453 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:32.924454 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert podName:3dc4e703-91ac-44ae-9d1a-83214f2378fd nodeName:}" failed. No retries permitted until 2026-04-16 23:26:33.924429795 +0000 UTC m=+34.540461256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert") pod "ingress-canary-cwjtr" (UID: "3dc4e703-91ac-44ae-9d1a-83214f2378fd") : secret "canary-serving-cert" not found Apr 16 23:26:32.978153 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.978111 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs" Apr 16 23:26:32.980541 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.980519 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-grjf2\"" Apr 16 23:26:32.980654 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.980521 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 23:26:32.980759 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:32.980740 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 23:26:33.440207 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.440009 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bhg8n"] Apr 16 23:26:33.473763 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.473729 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bhg8n"] Apr 16 23:26:33.473911 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.473823 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.476249 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.476223 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 23:26:33.527646 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.527612 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/30de2f08-6583-4f71-b865-e6f57f20268c-dbus\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.527849 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.527662 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30de2f08-6583-4f71-b865-e6f57f20268c-original-pull-secret\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.527849 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.527722 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/30de2f08-6583-4f71-b865-e6f57f20268c-kubelet-config\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.628447 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.628407 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/30de2f08-6583-4f71-b865-e6f57f20268c-kubelet-config\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 
23:26:33.628631 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.628501 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/30de2f08-6583-4f71-b865-e6f57f20268c-dbus\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.628631 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.628520 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30de2f08-6583-4f71-b865-e6f57f20268c-original-pull-secret\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.628631 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.628542 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/30de2f08-6583-4f71-b865-e6f57f20268c-kubelet-config\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.628768 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.628692 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/30de2f08-6583-4f71-b865-e6f57f20268c-dbus\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.630840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.630818 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30de2f08-6583-4f71-b865-e6f57f20268c-original-pull-secret\") pod \"global-pull-secret-syncer-bhg8n\" (UID: \"30de2f08-6583-4f71-b865-e6f57f20268c\") " 
pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.783048 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.783006 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bhg8n" Apr 16 23:26:33.930404 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.930369 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr" Apr 16 23:26:33.930576 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.930488 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb" Apr 16 23:26:33.930576 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:33.930520 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:33.930681 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:33.930587 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:33.930681 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:33.930597 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert podName:3dc4e703-91ac-44ae-9d1a-83214f2378fd nodeName:}" failed. No retries permitted until 2026-04-16 23:26:35.930577588 +0000 UTC m=+36.546609045 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert") pod "ingress-canary-cwjtr" (UID: "3dc4e703-91ac-44ae-9d1a-83214f2378fd") : secret "canary-serving-cert" not found Apr 16 23:26:33.930681 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:33.930637 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls podName:9efb52c5-c96d-422a-8c15-e03f71fdd622 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:35.930621682 +0000 UTC m=+36.546653144 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls") pod "dns-default-fr2wb" (UID: "9efb52c5-c96d-422a-8c15-e03f71fdd622") : secret "dns-default-metrics-tls" not found Apr 16 23:26:33.960145 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.960113 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bhg8n"] Apr 16 23:26:33.966237 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:26:33.966200 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30de2f08_6583_4f71_b865_e6f57f20268c.slice/crio-eab1f769d48a1c37e85a21aaec6003fc9308307c9c89eaf8f895360d268fa6e4 WatchSource:0}: Error finding container eab1f769d48a1c37e85a21aaec6003fc9308307c9c89eaf8f895360d268fa6e4: Status 404 returned error can't find the container with id eab1f769d48a1c37e85a21aaec6003fc9308307c9c89eaf8f895360d268fa6e4 Apr 16 23:26:33.978607 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.978575 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z" Apr 16 23:26:33.980863 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.980839 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 23:26:33.980977 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:33.980872 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wl6xg\"" Apr 16 23:26:34.224499 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:34.224456 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bhg8n" event={"ID":"30de2f08-6583-4f71-b865-e6f57f20268c","Type":"ContainerStarted","Data":"eab1f769d48a1c37e85a21aaec6003fc9308307c9c89eaf8f895360d268fa6e4"} Apr 16 23:26:35.947989 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:35.947933 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb" Apr 16 23:26:35.947989 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:35.947983 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr" Apr 16 23:26:35.948526 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:35.948167 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:35.948526 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:35.948179 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 16 23:26:35.948526 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:35.948247 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls podName:9efb52c5-c96d-422a-8c15-e03f71fdd622 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:39.948228272 +0000 UTC m=+40.564259746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls") pod "dns-default-fr2wb" (UID: "9efb52c5-c96d-422a-8c15-e03f71fdd622") : secret "dns-default-metrics-tls" not found Apr 16 23:26:35.948526 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:35.948266 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert podName:3dc4e703-91ac-44ae-9d1a-83214f2378fd nodeName:}" failed. No retries permitted until 2026-04-16 23:26:39.948257474 +0000 UTC m=+40.564288945 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert") pod "ingress-canary-cwjtr" (UID: "3dc4e703-91ac-44ae-9d1a-83214f2378fd") : secret "canary-serving-cert" not found Apr 16 23:26:39.978646 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:39.978392 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb" Apr 16 23:26:39.979043 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:39.978671 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr" Apr 16 23:26:39.979043 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:39.978532 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:39.979043 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:39.978791 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls podName:9efb52c5-c96d-422a-8c15-e03f71fdd622 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:47.978769808 +0000 UTC m=+48.594801266 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls") pod "dns-default-fr2wb" (UID: "9efb52c5-c96d-422a-8c15-e03f71fdd622") : secret "dns-default-metrics-tls" not found Apr 16 23:26:39.979043 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:39.978819 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:39.979043 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:39.978867 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert podName:3dc4e703-91ac-44ae-9d1a-83214f2378fd nodeName:}" failed. No retries permitted until 2026-04-16 23:26:47.978851637 +0000 UTC m=+48.594883105 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert") pod "ingress-canary-cwjtr" (UID: "3dc4e703-91ac-44ae-9d1a-83214f2378fd") : secret "canary-serving-cert" not found Apr 16 23:26:40.239995 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:40.239902 2564 generic.go:358] "Generic (PLEG): container finished" podID="5c265860-ea8b-4315-acf6-bbb9ee728fe8" containerID="74053b54e7a80a90bc0528165baa78bdbb47399d564a92c253f2b4327f85307f" exitCode=0 Apr 16 23:26:40.239995 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:40.239977 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" event={"ID":"5c265860-ea8b-4315-acf6-bbb9ee728fe8","Type":"ContainerDied","Data":"74053b54e7a80a90bc0528165baa78bdbb47399d564a92c253f2b4327f85307f"} Apr 16 23:26:40.241283 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:40.241255 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bhg8n" 
event={"ID":"30de2f08-6583-4f71-b865-e6f57f20268c","Type":"ContainerStarted","Data":"e0c0db3994d2058d697825041e7e0582f2617deeb5078078798de7ec5f17bafb"} Apr 16 23:26:40.273519 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:40.273457 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bhg8n" podStartSLOduration=1.841227825 podStartE2EDuration="7.273436578s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:33.968342764 +0000 UTC m=+34.584374240" lastFinishedPulling="2026-04-16 23:26:39.400551519 +0000 UTC m=+40.016582993" observedRunningTime="2026-04-16 23:26:40.27272597 +0000 UTC m=+40.888757450" watchObservedRunningTime="2026-04-16 23:26:40.273436578 +0000 UTC m=+40.889468061" Apr 16 23:26:41.245541 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:41.245503 2564 generic.go:358] "Generic (PLEG): container finished" podID="5c265860-ea8b-4315-acf6-bbb9ee728fe8" containerID="cb8c20ba941e38453ad82d9b223b279833720f8a7a8ffc80a4cddf4261fd03d8" exitCode=0 Apr 16 23:26:41.245926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:41.245584 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" event={"ID":"5c265860-ea8b-4315-acf6-bbb9ee728fe8","Type":"ContainerDied","Data":"cb8c20ba941e38453ad82d9b223b279833720f8a7a8ffc80a4cddf4261fd03d8"} Apr 16 23:26:42.250860 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:42.250819 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" event={"ID":"5c265860-ea8b-4315-acf6-bbb9ee728fe8","Type":"ContainerStarted","Data":"7db8895d62c09f97006cddf6888d4a51f436dd82512cc51610392290878f1eba"} Apr 16 23:26:42.271904 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:42.271845 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wnp9b" 
podStartSLOduration=4.122694991 podStartE2EDuration="42.271824154s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:01.244773843 +0000 UTC m=+1.860805300" lastFinishedPulling="2026-04-16 23:26:39.393902996 +0000 UTC m=+40.009934463" observedRunningTime="2026-04-16 23:26:42.270658445 +0000 UTC m=+42.886689923" watchObservedRunningTime="2026-04-16 23:26:42.271824154 +0000 UTC m=+42.887855632" Apr 16 23:26:48.036136 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:48.036092 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb" Apr 16 23:26:48.036136 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:48.036135 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr" Apr 16 23:26:48.036530 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:48.036232 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:48.036530 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:48.036235 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:48.036530 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:48.036283 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert podName:3dc4e703-91ac-44ae-9d1a-83214f2378fd nodeName:}" failed. No retries permitted until 2026-04-16 23:27:04.036268534 +0000 UTC m=+64.652299992 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert") pod "ingress-canary-cwjtr" (UID: "3dc4e703-91ac-44ae-9d1a-83214f2378fd") : secret "canary-serving-cert" not found Apr 16 23:26:48.036530 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:26:48.036296 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls podName:9efb52c5-c96d-422a-8c15-e03f71fdd622 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:04.036290205 +0000 UTC m=+64.652321662 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls") pod "dns-default-fr2wb" (UID: "9efb52c5-c96d-422a-8c15-e03f71fdd622") : secret "dns-default-metrics-tls" not found Apr 16 23:26:58.228616 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:26:58.228579 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-md28x" Apr 16 23:27:04.051236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.051177 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb" Apr 16 23:27:04.051236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.051231 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr" Apr 16 23:27:04.051748 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:04.051325 2564 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:27:04.051748 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:04.051336 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:27:04.051748 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:04.051374 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert podName:3dc4e703-91ac-44ae-9d1a-83214f2378fd nodeName:}" failed. No retries permitted until 2026-04-16 23:27:36.051360279 +0000 UTC m=+96.667391736 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert") pod "ingress-canary-cwjtr" (UID: "3dc4e703-91ac-44ae-9d1a-83214f2378fd") : secret "canary-serving-cert" not found Apr 16 23:27:04.051748 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:04.051407 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls podName:9efb52c5-c96d-422a-8c15-e03f71fdd622 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:36.051384072 +0000 UTC m=+96.667415536 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls") pod "dns-default-fr2wb" (UID: "9efb52c5-c96d-422a-8c15-e03f71fdd622") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:04.655578 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.655513 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:27:04.657745 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.657725 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 23:27:04.666471 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:04.666446 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 23:27:04.666550 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:04.666508 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:08.666493302 +0000 UTC m=+129.282524759 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : secret "metrics-daemon-secret" not found
Apr 16 23:27:04.755965 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.755919 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:27:04.758530 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.758510 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 23:27:04.768130 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.768110 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 23:27:04.780021 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.780001 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2frd4\" (UniqueName: \"kubernetes.io/projected/a35b971f-784e-46e9-b251-dbb9a720c52f-kube-api-access-2frd4\") pod \"network-check-target-rl6cs\" (UID: \"a35b971f-784e-46e9-b251-dbb9a720c52f\") " pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:27:04.788803 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.788783 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-grjf2\""
Apr 16 23:27:04.797423 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.797405 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:27:04.924590 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:04.924518 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rl6cs"]
Apr 16 23:27:04.927917 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:27:04.927875 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda35b971f_784e_46e9_b251_dbb9a720c52f.slice/crio-9934b39d9099161eb3cebe0f3ae5f94ec18c64eda15ca2be205ba179ee4917a7 WatchSource:0}: Error finding container 9934b39d9099161eb3cebe0f3ae5f94ec18c64eda15ca2be205ba179ee4917a7: Status 404 returned error can't find the container with id 9934b39d9099161eb3cebe0f3ae5f94ec18c64eda15ca2be205ba179ee4917a7
Apr 16 23:27:05.297459 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:05.297428 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rl6cs" event={"ID":"a35b971f-784e-46e9-b251-dbb9a720c52f","Type":"ContainerStarted","Data":"9934b39d9099161eb3cebe0f3ae5f94ec18c64eda15ca2be205ba179ee4917a7"}
Apr 16 23:27:08.304303 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:08.304267 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rl6cs" event={"ID":"a35b971f-784e-46e9-b251-dbb9a720c52f","Type":"ContainerStarted","Data":"47e8e4f606eeec63969c34ef15c40e7d89e902e67cd9ce53a2f6e9a92678a006"}
Apr 16 23:27:08.304683 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:08.304471 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:27:08.317519 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:08.317474 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rl6cs" podStartSLOduration=65.481068135 podStartE2EDuration="1m8.317459975s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:27:04.929829678 +0000 UTC m=+65.545861135" lastFinishedPulling="2026-04-16 23:27:07.766221505 +0000 UTC m=+68.382252975" observedRunningTime="2026-04-16 23:27:08.317207433 +0000 UTC m=+68.933238917" watchObservedRunningTime="2026-04-16 23:27:08.317459975 +0000 UTC m=+68.933491445"
Apr 16 23:27:36.068714 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:36.068660 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:27:36.068714 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:36.068726 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:27:36.069235 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:36.068816 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:27:36.069235 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:36.068819 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:27:36.069235 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:36.068865 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert podName:3dc4e703-91ac-44ae-9d1a-83214f2378fd nodeName:}" failed. No retries permitted until 2026-04-16 23:28:40.068852493 +0000 UTC m=+160.684883950 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert") pod "ingress-canary-cwjtr" (UID: "3dc4e703-91ac-44ae-9d1a-83214f2378fd") : secret "canary-serving-cert" not found
Apr 16 23:27:36.069235 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:27:36.068878 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls podName:9efb52c5-c96d-422a-8c15-e03f71fdd622 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:40.06887197 +0000 UTC m=+160.684903427 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls") pod "dns-default-fr2wb" (UID: "9efb52c5-c96d-422a-8c15-e03f71fdd622") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:39.307882 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:39.307848 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rl6cs"
Apr 16 23:27:58.884517 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.884482 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-69rmt"]
Apr 16 23:27:58.888724 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.888688 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:58.890718 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.890668 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 23:27:58.890862 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.890785 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 23:27:58.890862 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.890825 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-sp6bs\""
Apr 16 23:27:58.890862 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.890826 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 23:27:58.891599 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.891577 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 23:27:58.893969 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.893949 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-69rmt"]
Apr 16 23:27:58.895195 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.895166 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 23:27:58.928950 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.928902 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5316185-790a-4a40-b230-e6cc6cc0b80b-tmp\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:58.929099 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.928964 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a5316185-790a-4a40-b230-e6cc6cc0b80b-snapshots\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:58.929099 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.929053 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5316185-790a-4a40-b230-e6cc6cc0b80b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:58.929099 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.929079 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5316185-790a-4a40-b230-e6cc6cc0b80b-serving-cert\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:58.929233 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.929105 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5316185-790a-4a40-b230-e6cc6cc0b80b-service-ca-bundle\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:58.929233 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:58.929127 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh6j5\" (UniqueName: \"kubernetes.io/projected/a5316185-790a-4a40-b230-e6cc6cc0b80b-kube-api-access-jh6j5\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.029879 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.029840 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5316185-790a-4a40-b230-e6cc6cc0b80b-service-ca-bundle\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.029879 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.029881 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh6j5\" (UniqueName: \"kubernetes.io/projected/a5316185-790a-4a40-b230-e6cc6cc0b80b-kube-api-access-jh6j5\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.030137 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.029911 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5316185-790a-4a40-b230-e6cc6cc0b80b-tmp\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.030137 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.029943 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a5316185-790a-4a40-b230-e6cc6cc0b80b-snapshots\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.030137 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.029985 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5316185-790a-4a40-b230-e6cc6cc0b80b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.030137 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.030001 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5316185-790a-4a40-b230-e6cc6cc0b80b-serving-cert\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.030358 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.030340 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5316185-790a-4a40-b230-e6cc6cc0b80b-tmp\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.030577 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.030544 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5316185-790a-4a40-b230-e6cc6cc0b80b-service-ca-bundle\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.030724 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.030645 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a5316185-790a-4a40-b230-e6cc6cc0b80b-snapshots\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.030945 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.030925 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5316185-790a-4a40-b230-e6cc6cc0b80b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.032275 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.032254 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5316185-790a-4a40-b230-e6cc6cc0b80b-serving-cert\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.036988 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.036966 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh6j5\" (UniqueName: \"kubernetes.io/projected/a5316185-790a-4a40-b230-e6cc6cc0b80b-kube-api-access-jh6j5\") pod \"insights-operator-585dfdc468-69rmt\" (UID: \"a5316185-790a-4a40-b230-e6cc6cc0b80b\") " pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.198281 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.198167 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-69rmt"
Apr 16 23:27:59.308190 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.308157 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-69rmt"]
Apr 16 23:27:59.311770 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:27:59.311739 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5316185_790a_4a40_b230_e6cc6cc0b80b.slice/crio-4b36259c79557b86b3a29674d1cf13f14f0a3f7ddf3d639d2079da6256fd1f11 WatchSource:0}: Error finding container 4b36259c79557b86b3a29674d1cf13f14f0a3f7ddf3d639d2079da6256fd1f11: Status 404 returned error can't find the container with id 4b36259c79557b86b3a29674d1cf13f14f0a3f7ddf3d639d2079da6256fd1f11
Apr 16 23:27:59.400597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:27:59.400558 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-69rmt" event={"ID":"a5316185-790a-4a40-b230-e6cc6cc0b80b","Type":"ContainerStarted","Data":"4b36259c79557b86b3a29674d1cf13f14f0a3f7ddf3d639d2079da6256fd1f11"}
Apr 16 23:28:02.408072 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:02.408033 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-69rmt" event={"ID":"a5316185-790a-4a40-b230-e6cc6cc0b80b","Type":"ContainerStarted","Data":"d8305379514916e55559ef5f41eac1f5c23d28616d57e3efeb30a968b2003b23"}
Apr 16 23:28:02.426017 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:02.425972 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-69rmt" podStartSLOduration=2.227708642 podStartE2EDuration="4.425958279s" podCreationTimestamp="2026-04-16 23:27:58 +0000 UTC" firstStartedPulling="2026-04-16 23:27:59.313565302 +0000 UTC m=+119.929596761" lastFinishedPulling="2026-04-16 23:28:01.511814938 +0000 UTC m=+122.127846398" observedRunningTime="2026-04-16 23:28:02.424681858 +0000 UTC m=+123.040713356" watchObservedRunningTime="2026-04-16 23:28:02.425958279 +0000 UTC m=+123.041989759"
Apr 16 23:28:04.116657 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:04.116629 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4kt5z_5aea2741-baa5-486d-8a0f-9eef53a7f27a/dns-node-resolver/0.log"
Apr 16 23:28:05.117334 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:05.117304 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dg2kq_fa4aee53-dd23-4d2a-9b51-9a0d0822c22a/node-ca/0.log"
Apr 16 23:28:08.700229 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.700171 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:28:08.700684 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:08.700343 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 23:28:08.700684 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:08.700432 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs podName:e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2 nodeName:}" failed. No retries permitted until 2026-04-16 23:30:10.700409921 +0000 UTC m=+251.316441397 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs") pod "network-metrics-daemon-zp26z" (UID: "e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2") : secret "metrics-daemon-secret" not found
Apr 16 23:28:08.863548 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.863513 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ldlc"]
Apr 16 23:28:08.866357 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.866341 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:08.868275 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.868244 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 23:28:08.868389 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.868369 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-pvxhw\""
Apr 16 23:28:08.868604 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.868585 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:28:08.868604 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.868598 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 23:28:08.869142 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.869125 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 23:28:08.873421 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.873403 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ldlc"]
Apr 16 23:28:08.875060 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.875037 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 23:28:08.901348 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.901322 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-trusted-ca\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:08.901447 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.901360 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-config\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:08.901447 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.901377 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l288n\" (UniqueName: \"kubernetes.io/projected/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-kube-api-access-l288n\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:08.901519 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.901456 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-serving-cert\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:08.965757 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.965732 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"]
Apr 16 23:28:08.968408 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.968392 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:08.970361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.970322 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 23:28:08.970488 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.970373 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 23:28:08.970552 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.970530 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:28:08.970623 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.970607 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 23:28:08.970680 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.970646 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-hvrrx\""
Apr 16 23:28:08.975828 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:08.975806 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"]
Apr 16 23:28:09.002253 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.002229 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdw77\" (UniqueName: \"kubernetes.io/projected/2665ad6e-102c-40fd-8fac-4e7fdd52738a-kube-api-access-kdw77\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.002385 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.002282 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-trusted-ca\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.002385 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.002301 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2665ad6e-102c-40fd-8fac-4e7fdd52738a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.002385 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.002359 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-config\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.002385 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.002382 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l288n\" (UniqueName: \"kubernetes.io/projected/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-kube-api-access-l288n\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.002585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.002417 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-serving-cert\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.002585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.002466 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2665ad6e-102c-40fd-8fac-4e7fdd52738a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.003023 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.002994 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-trusted-ca\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.003118 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.003031 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-config\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.004655 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.004632 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-serving-cert\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.010032 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.009987 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l288n\" (UniqueName: \"kubernetes.io/projected/ed5b2be5-9e2a-419f-989a-30f08a0e3d57-kube-api-access-l288n\") pod \"console-operator-9d4b6777b-8ldlc\" (UID: \"ed5b2be5-9e2a-419f-989a-30f08a0e3d57\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.103789 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.103756 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2665ad6e-102c-40fd-8fac-4e7fdd52738a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.103946 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.103813 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2665ad6e-102c-40fd-8fac-4e7fdd52738a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.103946 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.103841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdw77\" (UniqueName: \"kubernetes.io/projected/2665ad6e-102c-40fd-8fac-4e7fdd52738a-kube-api-access-kdw77\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.104412 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.104380 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2665ad6e-102c-40fd-8fac-4e7fdd52738a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.106000 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.105976 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2665ad6e-102c-40fd-8fac-4e7fdd52738a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.110482 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.110447 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdw77\" (UniqueName: \"kubernetes.io/projected/2665ad6e-102c-40fd-8fac-4e7fdd52738a-kube-api-access-kdw77\") pod \"kube-storage-version-migrator-operator-6769c5d45-67vgr\" (UID: \"2665ad6e-102c-40fd-8fac-4e7fdd52738a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.175945 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.175909 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:09.277999 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.277968 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"
Apr 16 23:28:09.288582 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.288557 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ldlc"]
Apr 16 23:28:09.292802 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:09.292775 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded5b2be5_9e2a_419f_989a_30f08a0e3d57.slice/crio-4122ccc71c936a1b5ab5f2d7bed812afc4ad0c45144bdda5e5a4914898f0295b WatchSource:0}: Error finding container 4122ccc71c936a1b5ab5f2d7bed812afc4ad0c45144bdda5e5a4914898f0295b: Status 404 returned error can't find the container with id 4122ccc71c936a1b5ab5f2d7bed812afc4ad0c45144bdda5e5a4914898f0295b
Apr 16 23:28:09.391017 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.390984 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr"]
Apr 16 23:28:09.393863 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:09.393833 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2665ad6e_102c_40fd_8fac_4e7fdd52738a.slice/crio-b1afb530aa13c18c42f4d04538423d48df8f2142d89e7b8d87cf78543fcdc5bc WatchSource:0}: Error finding container b1afb530aa13c18c42f4d04538423d48df8f2142d89e7b8d87cf78543fcdc5bc: Status 404 returned error can't find the container with id b1afb530aa13c18c42f4d04538423d48df8f2142d89e7b8d87cf78543fcdc5bc
Apr 16 23:28:09.423174 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.423139 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr" event={"ID":"2665ad6e-102c-40fd-8fac-4e7fdd52738a","Type":"ContainerStarted","Data":"b1afb530aa13c18c42f4d04538423d48df8f2142d89e7b8d87cf78543fcdc5bc"}
Apr 16 23:28:09.424069 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:09.424045 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" event={"ID":"ed5b2be5-9e2a-419f-989a-30f08a0e3d57","Type":"ContainerStarted","Data":"4122ccc71c936a1b5ab5f2d7bed812afc4ad0c45144bdda5e5a4914898f0295b"}
Apr 16 23:28:12.433101 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.433068 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/0.log"
Apr 16 23:28:12.433534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.433113 2564 generic.go:358] "Generic (PLEG): container finished" podID="ed5b2be5-9e2a-419f-989a-30f08a0e3d57" containerID="9126df5e0cdbe143e9162af3d99dd77029c26e3ff2ce38b3ed2a4ea5b951d750" exitCode=255
Apr 16 23:28:12.433534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.433205 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" event={"ID":"ed5b2be5-9e2a-419f-989a-30f08a0e3d57","Type":"ContainerDied","Data":"9126df5e0cdbe143e9162af3d99dd77029c26e3ff2ce38b3ed2a4ea5b951d750"}
Apr 16 23:28:12.433534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.433443 2564 scope.go:117] "RemoveContainer" containerID="9126df5e0cdbe143e9162af3d99dd77029c26e3ff2ce38b3ed2a4ea5b951d750"
Apr 16 23:28:12.434547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.434522 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr" event={"ID":"2665ad6e-102c-40fd-8fac-4e7fdd52738a","Type":"ContainerStarted","Data":"c712793f9b6d56940b3d528cb78b2bef7c30d47ea250ee50a3db7dd585927f94"} Apr 16 23:28:12.460168 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.460126 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr" podStartSLOduration=2.047651002 podStartE2EDuration="4.460110062s" podCreationTimestamp="2026-04-16 23:28:08 +0000 UTC" firstStartedPulling="2026-04-16 23:28:09.395635914 +0000 UTC m=+130.011667371" lastFinishedPulling="2026-04-16 23:28:11.808094968 +0000 UTC m=+132.424126431" observedRunningTime="2026-04-16 23:28:12.459547878 +0000 UTC m=+133.075579356" watchObservedRunningTime="2026-04-16 23:28:12.460110062 +0000 UTC m=+133.076141536" Apr 16 23:28:12.948834 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.948796 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8"] Apr 16 23:28:12.951896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.951878 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" Apr 16 23:28:12.954205 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.954180 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-4cpsp\"" Apr 16 23:28:12.954314 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.954217 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 23:28:12.954544 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.954529 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 23:28:12.963513 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:12.963494 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8"] Apr 16 23:28:13.037241 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.037198 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ksd7\" (UniqueName: \"kubernetes.io/projected/ea4856cb-ab54-40ff-8382-60807f91deea-kube-api-access-5ksd7\") pod \"migrator-74bb7799d9-4ljv8\" (UID: \"ea4856cb-ab54-40ff-8382-60807f91deea\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" Apr 16 23:28:13.138030 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.137992 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ksd7\" (UniqueName: \"kubernetes.io/projected/ea4856cb-ab54-40ff-8382-60807f91deea-kube-api-access-5ksd7\") pod \"migrator-74bb7799d9-4ljv8\" (UID: \"ea4856cb-ab54-40ff-8382-60807f91deea\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" Apr 16 23:28:13.144966 ip-10-0-131-43 kubenswrapper[2564]: 
I0416 23:28:13.144937 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ksd7\" (UniqueName: \"kubernetes.io/projected/ea4856cb-ab54-40ff-8382-60807f91deea-kube-api-access-5ksd7\") pod \"migrator-74bb7799d9-4ljv8\" (UID: \"ea4856cb-ab54-40ff-8382-60807f91deea\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" Apr 16 23:28:13.261021 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.260986 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" Apr 16 23:28:13.377762 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.377732 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8"] Apr 16 23:28:13.381076 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:13.381050 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea4856cb_ab54_40ff_8382_60807f91deea.slice/crio-17cea384724b6174ad0d5ae7cda2885ea007cc4d1df09bb527f8cbee8d627201 WatchSource:0}: Error finding container 17cea384724b6174ad0d5ae7cda2885ea007cc4d1df09bb527f8cbee8d627201: Status 404 returned error can't find the container with id 17cea384724b6174ad0d5ae7cda2885ea007cc4d1df09bb527f8cbee8d627201 Apr 16 23:28:13.438767 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.438741 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/1.log" Apr 16 23:28:13.439162 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.439106 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/0.log" Apr 16 23:28:13.439162 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.439137 2564 
generic.go:358] "Generic (PLEG): container finished" podID="ed5b2be5-9e2a-419f-989a-30f08a0e3d57" containerID="46b59f059d926ad81b8a0c1087a86ceabefd29af29888e4fe3cb9883f7f381c0" exitCode=255 Apr 16 23:28:13.439245 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.439169 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" event={"ID":"ed5b2be5-9e2a-419f-989a-30f08a0e3d57","Type":"ContainerDied","Data":"46b59f059d926ad81b8a0c1087a86ceabefd29af29888e4fe3cb9883f7f381c0"} Apr 16 23:28:13.439245 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.439210 2564 scope.go:117] "RemoveContainer" containerID="9126df5e0cdbe143e9162af3d99dd77029c26e3ff2ce38b3ed2a4ea5b951d750" Apr 16 23:28:13.439486 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.439461 2564 scope.go:117] "RemoveContainer" containerID="46b59f059d926ad81b8a0c1087a86ceabefd29af29888e4fe3cb9883f7f381c0" Apr 16 23:28:13.439671 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:13.439648 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ldlc_openshift-console-operator(ed5b2be5-9e2a-419f-989a-30f08a0e3d57)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" podUID="ed5b2be5-9e2a-419f-989a-30f08a0e3d57" Apr 16 23:28:13.440497 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:13.440473 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" event={"ID":"ea4856cb-ab54-40ff-8382-60807f91deea","Type":"ContainerStarted","Data":"17cea384724b6174ad0d5ae7cda2885ea007cc4d1df09bb527f8cbee8d627201"} Apr 16 23:28:14.446711 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:14.446666 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/1.log" Apr 16 23:28:14.447158 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:14.447143 2564 scope.go:117] "RemoveContainer" containerID="46b59f059d926ad81b8a0c1087a86ceabefd29af29888e4fe3cb9883f7f381c0" Apr 16 23:28:14.447359 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:14.447333 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ldlc_openshift-console-operator(ed5b2be5-9e2a-419f-989a-30f08a0e3d57)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" podUID="ed5b2be5-9e2a-419f-989a-30f08a0e3d57" Apr 16 23:28:15.049414 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.049385 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8"] Apr 16 23:28:15.052182 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.052167 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8" Apr 16 23:28:15.054110 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.054090 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-6l42h\"" Apr 16 23:28:15.059224 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.059204 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8"] Apr 16 23:28:15.153186 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.153141 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf8wc\" (UniqueName: \"kubernetes.io/projected/d2fcd77b-9751-44d5-a36b-64111dfec87c-kube-api-access-cf8wc\") pod \"network-check-source-8894fc9bd-gwhc8\" (UID: \"d2fcd77b-9751-44d5-a36b-64111dfec87c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8" Apr 16 23:28:15.253791 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.253760 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cf8wc\" (UniqueName: \"kubernetes.io/projected/d2fcd77b-9751-44d5-a36b-64111dfec87c-kube-api-access-cf8wc\") pod \"network-check-source-8894fc9bd-gwhc8\" (UID: \"d2fcd77b-9751-44d5-a36b-64111dfec87c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8" Apr 16 23:28:15.261092 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.261069 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf8wc\" (UniqueName: \"kubernetes.io/projected/d2fcd77b-9751-44d5-a36b-64111dfec87c-kube-api-access-cf8wc\") pod \"network-check-source-8894fc9bd-gwhc8\" (UID: \"d2fcd77b-9751-44d5-a36b-64111dfec87c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8" Apr 16 23:28:15.361416 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:28:15.361324 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8" Apr 16 23:28:15.450866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.450798 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" event={"ID":"ea4856cb-ab54-40ff-8382-60807f91deea","Type":"ContainerStarted","Data":"2992dbb35ee67ed23455aed509bf24ffb2dd4ac10e5895da01d7331c03361e08"} Apr 16 23:28:15.450866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.450842 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" event={"ID":"ea4856cb-ab54-40ff-8382-60807f91deea","Type":"ContainerStarted","Data":"e72cb5f8b3ca2ee304993fb86657790956b338ef3580146a48aff4e8a6d99d4c"} Apr 16 23:28:15.465097 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.465055 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4ljv8" podStartSLOduration=2.227419694 podStartE2EDuration="3.465039688s" podCreationTimestamp="2026-04-16 23:28:12 +0000 UTC" firstStartedPulling="2026-04-16 23:28:13.383418374 +0000 UTC m=+133.999449832" lastFinishedPulling="2026-04-16 23:28:14.621038369 +0000 UTC m=+135.237069826" observedRunningTime="2026-04-16 23:28:15.464358029 +0000 UTC m=+136.080389505" watchObservedRunningTime="2026-04-16 23:28:15.465039688 +0000 UTC m=+136.081071167" Apr 16 23:28:15.481999 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.481968 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-79f5857d9-2hkvf"] Apr 16 23:28:15.486078 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.486062 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.489109 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.489084 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xwc2q\"" Apr 16 23:28:15.489217 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.489142 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 23:28:15.489491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.489478 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 23:28:15.489853 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.489834 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 23:28:15.497622 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.497270 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8"] Apr 16 23:28:15.497764 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.497734 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 23:28:15.502056 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:15.502030 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2fcd77b_9751_44d5_a36b_64111dfec87c.slice/crio-711ed427a019db8cfd42bc5883e131dfea7497780d3006ee277e3ad171a7265b WatchSource:0}: Error finding container 711ed427a019db8cfd42bc5883e131dfea7497780d3006ee277e3ad171a7265b: Status 404 returned error can't find the container with id 711ed427a019db8cfd42bc5883e131dfea7497780d3006ee277e3ad171a7265b Apr 16 23:28:15.504603 ip-10-0-131-43 kubenswrapper[2564]: 
I0416 23:28:15.504585 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79f5857d9-2hkvf"] Apr 16 23:28:15.556368 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.556345 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-trusted-ca\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.556477 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.556388 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkvxg\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-kube-api-access-tkvxg\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.556477 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.556407 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-bound-sa-token\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.556595 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.556488 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-installation-pull-secrets\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.556595 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:28:15.556534 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6ba0269-a448-409c-a5ac-b14dea6a67bf-ca-trust-extracted\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.556595 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.556567 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-certificates\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.556757 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.556631 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.556757 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.556663 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-image-registry-private-configuration\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.657829 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.657737 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f6ba0269-a448-409c-a5ac-b14dea6a67bf-ca-trust-extracted\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.657829 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.657776 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-certificates\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.657829 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.657810 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.658094 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.657859 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-image-registry-private-configuration\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.658094 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.657894 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-trusted-ca\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 
23:28:15.658094 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.657930 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkvxg\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-kube-api-access-tkvxg\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.658094 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:15.657972 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:28:15.658094 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:15.657996 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79f5857d9-2hkvf: secret "image-registry-tls" not found Apr 16 23:28:15.658094 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:15.658058 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls podName:f6ba0269-a448-409c-a5ac-b14dea6a67bf nodeName:}" failed. No retries permitted until 2026-04-16 23:28:16.158036521 +0000 UTC m=+136.774068003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls") pod "image-registry-79f5857d9-2hkvf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf") : secret "image-registry-tls" not found Apr 16 23:28:15.658424 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.658185 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-bound-sa-token\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.658424 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.658232 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-installation-pull-secrets\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.658800 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.658777 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6ba0269-a448-409c-a5ac-b14dea6a67bf-ca-trust-extracted\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.659095 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.659060 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-certificates\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" 
Apr 16 23:28:15.659517 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.659491 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-trusted-ca\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.660374 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.660347 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-image-registry-private-configuration\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.660774 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.660752 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-installation-pull-secrets\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.666141 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.666122 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-bound-sa-token\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.666236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.666175 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkvxg\" (UniqueName: 
\"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-kube-api-access-tkvxg\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:15.749614 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.749573 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-n9vr7"] Apr 16 23:28:15.754012 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.753990 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.756127 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.756093 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 23:28:15.756417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.756161 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jrpnr\"" Apr 16 23:28:15.756417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.756163 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 23:28:15.760993 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.760933 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n9vr7"] Apr 16 23:28:15.860673 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.860632 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk68m\" (UniqueName: \"kubernetes.io/projected/03a8050d-e2eb-4c21-91c0-a18c9baefeca-kube-api-access-zk68m\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.860883 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:28:15.860681 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/03a8050d-e2eb-4c21-91c0-a18c9baefeca-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.860883 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.860718 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.860991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.860891 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/03a8050d-e2eb-4c21-91c0-a18c9baefeca-crio-socket\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.860991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.860925 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/03a8050d-e2eb-4c21-91c0-a18c9baefeca-data-volume\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.962102 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.962069 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/03a8050d-e2eb-4c21-91c0-a18c9baefeca-crio-socket\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.962277 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.962115 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/03a8050d-e2eb-4c21-91c0-a18c9baefeca-data-volume\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.962277 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.962153 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk68m\" (UniqueName: \"kubernetes.io/projected/03a8050d-e2eb-4c21-91c0-a18c9baefeca-kube-api-access-zk68m\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.962277 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.962192 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/03a8050d-e2eb-4c21-91c0-a18c9baefeca-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.962277 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.962190 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/03a8050d-e2eb-4c21-91c0-a18c9baefeca-crio-socket\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.962505 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:28:15.962382 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.962565 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:15.962517 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:15.962611 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.962557 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/03a8050d-e2eb-4c21-91c0-a18c9baefeca-data-volume\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.962611 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:15.962579 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls podName:03a8050d-e2eb-4c21-91c0-a18c9baefeca nodeName:}" failed. No retries permitted until 2026-04-16 23:28:16.462563877 +0000 UTC m=+137.078595333 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls") pod "insights-runtime-extractor-n9vr7" (UID: "03a8050d-e2eb-4c21-91c0-a18c9baefeca") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:15.962864 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.962846 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/03a8050d-e2eb-4c21-91c0-a18c9baefeca-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:15.973104 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:15.973084 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk68m\" (UniqueName: \"kubernetes.io/projected/03a8050d-e2eb-4c21-91c0-a18c9baefeca-kube-api-access-zk68m\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:16.164389 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:16.164351 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:16.164590 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:16.164523 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:28:16.164590 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:16.164547 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-79f5857d9-2hkvf: secret "image-registry-tls" not found Apr 16 23:28:16.164734 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:16.164617 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls podName:f6ba0269-a448-409c-a5ac-b14dea6a67bf nodeName:}" failed. No retries permitted until 2026-04-16 23:28:17.164597006 +0000 UTC m=+137.780628463 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls") pod "image-registry-79f5857d9-2hkvf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf") : secret "image-registry-tls" not found Apr 16 23:28:16.454546 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:16.454511 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8" event={"ID":"d2fcd77b-9751-44d5-a36b-64111dfec87c","Type":"ContainerStarted","Data":"53c18146bf3752e827d099c9473633f51f75979948bb72dcd751be001c1a2eaa"} Apr 16 23:28:16.454929 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:16.454557 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8" event={"ID":"d2fcd77b-9751-44d5-a36b-64111dfec87c","Type":"ContainerStarted","Data":"711ed427a019db8cfd42bc5883e131dfea7497780d3006ee277e3ad171a7265b"} Apr 16 23:28:16.466236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:16.466211 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:16.466387 ip-10-0-131-43 kubenswrapper[2564]: E0416 
23:28:16.466368 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:16.466449 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:16.466439 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls podName:03a8050d-e2eb-4c21-91c0-a18c9baefeca nodeName:}" failed. No retries permitted until 2026-04-16 23:28:17.466420017 +0000 UTC m=+138.082451478 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls") pod "insights-runtime-extractor-n9vr7" (UID: "03a8050d-e2eb-4c21-91c0-a18c9baefeca") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:16.468628 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:16.468592 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gwhc8" podStartSLOduration=1.4685799990000001 podStartE2EDuration="1.468579999s" podCreationTimestamp="2026-04-16 23:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:28:16.467243769 +0000 UTC m=+137.083275248" watchObservedRunningTime="2026-04-16 23:28:16.468579999 +0000 UTC m=+137.084611525" Apr 16 23:28:17.171408 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:17.171369 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:17.171574 ip-10-0-131-43 kubenswrapper[2564]: E0416 
23:28:17.171552 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:28:17.171614 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:17.171579 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79f5857d9-2hkvf: secret "image-registry-tls" not found Apr 16 23:28:17.171664 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:17.171635 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls podName:f6ba0269-a448-409c-a5ac-b14dea6a67bf nodeName:}" failed. No retries permitted until 2026-04-16 23:28:19.171618293 +0000 UTC m=+139.787649749 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls") pod "image-registry-79f5857d9-2hkvf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf") : secret "image-registry-tls" not found Apr 16 23:28:17.473387 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:17.473360 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:17.473751 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:17.473503 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:17.473751 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:17.473567 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls 
podName:03a8050d-e2eb-4c21-91c0-a18c9baefeca nodeName:}" failed. No retries permitted until 2026-04-16 23:28:19.473548443 +0000 UTC m=+140.089579903 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls") pod "insights-runtime-extractor-n9vr7" (UID: "03a8050d-e2eb-4c21-91c0-a18c9baefeca") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:19.176268 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:19.176230 2564 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" Apr 16 23:28:19.176268 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:19.176272 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" Apr 16 23:28:19.176678 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:19.176615 2564 scope.go:117] "RemoveContainer" containerID="46b59f059d926ad81b8a0c1087a86ceabefd29af29888e4fe3cb9883f7f381c0" Apr 16 23:28:19.176822 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:19.176804 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ldlc_openshift-console-operator(ed5b2be5-9e2a-419f-989a-30f08a0e3d57)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" podUID="ed5b2be5-9e2a-419f-989a-30f08a0e3d57" Apr 16 23:28:19.187348 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:19.187328 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " 
pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:19.187481 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:19.187461 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:28:19.187515 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:19.187483 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79f5857d9-2hkvf: secret "image-registry-tls" not found Apr 16 23:28:19.187550 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:19.187528 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls podName:f6ba0269-a448-409c-a5ac-b14dea6a67bf nodeName:}" failed. No retries permitted until 2026-04-16 23:28:23.187513393 +0000 UTC m=+143.803544850 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls") pod "image-registry-79f5857d9-2hkvf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf") : secret "image-registry-tls" not found Apr 16 23:28:19.489859 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:19.489825 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:19.490017 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:19.489969 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:19.490056 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:19.490037 2564 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls podName:03a8050d-e2eb-4c21-91c0-a18c9baefeca nodeName:}" failed. No retries permitted until 2026-04-16 23:28:23.490020213 +0000 UTC m=+144.106051669 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls") pod "insights-runtime-extractor-n9vr7" (UID: "03a8050d-e2eb-4c21-91c0-a18c9baefeca") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:23.220557 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.220527 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:23.222955 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.222930 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") pod \"image-registry-79f5857d9-2hkvf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") " pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:23.299195 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.299163 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:23.415216 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.415179 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79f5857d9-2hkvf"] Apr 16 23:28:23.418234 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:23.418205 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ba0269_a448_409c_a5ac_b14dea6a67bf.slice/crio-b1b40809ef2629821184fd92f26689ad836fb5d1948523547647662e11ff184b WatchSource:0}: Error finding container b1b40809ef2629821184fd92f26689ad836fb5d1948523547647662e11ff184b: Status 404 returned error can't find the container with id b1b40809ef2629821184fd92f26689ad836fb5d1948523547647662e11ff184b Apr 16 23:28:23.472574 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.472543 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" event={"ID":"f6ba0269-a448-409c-a5ac-b14dea6a67bf","Type":"ContainerStarted","Data":"b1b40809ef2629821184fd92f26689ad836fb5d1948523547647662e11ff184b"} Apr 16 23:28:23.523523 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.523488 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n9vr7\" (UID: \"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:23.525838 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.525818 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/03a8050d-e2eb-4c21-91c0-a18c9baefeca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n9vr7\" (UID: 
\"03a8050d-e2eb-4c21-91c0-a18c9baefeca\") " pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:23.563868 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.563832 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n9vr7" Apr 16 23:28:23.678758 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:23.678723 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n9vr7"] Apr 16 23:28:23.682738 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:23.682689 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03a8050d_e2eb_4c21_91c0_a18c9baefeca.slice/crio-f189d4a2bfbbf5a248d700f67b6cdfa913e72c3c9e251f744f79a8d10b3c9abd WatchSource:0}: Error finding container f189d4a2bfbbf5a248d700f67b6cdfa913e72c3c9e251f744f79a8d10b3c9abd: Status 404 returned error can't find the container with id f189d4a2bfbbf5a248d700f67b6cdfa913e72c3c9e251f744f79a8d10b3c9abd Apr 16 23:28:24.476447 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:24.476411 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" event={"ID":"f6ba0269-a448-409c-a5ac-b14dea6a67bf","Type":"ContainerStarted","Data":"b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49"} Apr 16 23:28:24.476943 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:24.476492 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" Apr 16 23:28:24.477970 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:24.477941 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n9vr7" event={"ID":"03a8050d-e2eb-4c21-91c0-a18c9baefeca","Type":"ContainerStarted","Data":"3569025ef63a9cb1ed3941223f03a1e93e193cac902cf6584efaa2814ac505d3"} Apr 16 
23:28:24.477970 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:24.477970 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n9vr7" event={"ID":"03a8050d-e2eb-4c21-91c0-a18c9baefeca","Type":"ContainerStarted","Data":"f189d4a2bfbbf5a248d700f67b6cdfa913e72c3c9e251f744f79a8d10b3c9abd"} Apr 16 23:28:24.493358 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:24.493309 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" podStartSLOduration=9.493290882 podStartE2EDuration="9.493290882s" podCreationTimestamp="2026-04-16 23:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:28:24.492188727 +0000 UTC m=+145.108220231" watchObservedRunningTime="2026-04-16 23:28:24.493290882 +0000 UTC m=+145.109322365" Apr 16 23:28:25.482451 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:25.482402 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n9vr7" event={"ID":"03a8050d-e2eb-4c21-91c0-a18c9baefeca","Type":"ContainerStarted","Data":"b36b22732bf4695a81c71a411b95c4541a98181d90bc64d01480ee2025a62527"} Apr 16 23:28:26.486023 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:26.485994 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n9vr7" event={"ID":"03a8050d-e2eb-4c21-91c0-a18c9baefeca","Type":"ContainerStarted","Data":"fe81e6cf64ca8d22d012f9006da25ae531fd3f2c2bd4bbd582a2a64a47746ff8"} Apr 16 23:28:26.501731 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:26.501668 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-n9vr7" podStartSLOduration=9.270896913 podStartE2EDuration="11.501653989s" podCreationTimestamp="2026-04-16 23:28:15 +0000 UTC" firstStartedPulling="2026-04-16 
23:28:23.745062849 +0000 UTC m=+144.361094313" lastFinishedPulling="2026-04-16 23:28:25.975819932 +0000 UTC m=+146.591851389" observedRunningTime="2026-04-16 23:28:26.500978842 +0000 UTC m=+147.117010322" watchObservedRunningTime="2026-04-16 23:28:26.501653989 +0000 UTC m=+147.117685467" Apr 16 23:28:30.978935 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:30.978904 2564 scope.go:117] "RemoveContainer" containerID="46b59f059d926ad81b8a0c1087a86ceabefd29af29888e4fe3cb9883f7f381c0" Apr 16 23:28:31.501661 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:31.501636 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:28:31.502000 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:31.501985 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/1.log" Apr 16 23:28:31.502059 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:31.502018 2564 generic.go:358] "Generic (PLEG): container finished" podID="ed5b2be5-9e2a-419f-989a-30f08a0e3d57" containerID="4e54f717c73a180f66f1634a5620d34eb39f8fe11ba4d0b3f8e4502917730d1f" exitCode=255 Apr 16 23:28:31.502059 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:31.502047 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" event={"ID":"ed5b2be5-9e2a-419f-989a-30f08a0e3d57","Type":"ContainerDied","Data":"4e54f717c73a180f66f1634a5620d34eb39f8fe11ba4d0b3f8e4502917730d1f"} Apr 16 23:28:31.502126 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:31.502075 2564 scope.go:117] "RemoveContainer" containerID="46b59f059d926ad81b8a0c1087a86ceabefd29af29888e4fe3cb9883f7f381c0" Apr 16 23:28:31.502397 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:31.502374 2564 scope.go:117] "RemoveContainer" 
containerID="4e54f717c73a180f66f1634a5620d34eb39f8fe11ba4d0b3f8e4502917730d1f" Apr 16 23:28:31.502593 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:31.502573 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ldlc_openshift-console-operator(ed5b2be5-9e2a-419f-989a-30f08a0e3d57)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" podUID="ed5b2be5-9e2a-419f-989a-30f08a0e3d57" Apr 16 23:28:32.506038 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:32.506012 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:28:33.899796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:33.899765 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-79f5857d9-2hkvf"] Apr 16 23:28:35.280295 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:35.280249 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cwjtr" podUID="3dc4e703-91ac-44ae-9d1a-83214f2378fd" Apr 16 23:28:35.317179 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:35.317133 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fr2wb" podUID="9efb52c5-c96d-422a-8c15-e03f71fdd622" Apr 16 23:28:35.514605 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:35.514566 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cwjtr" Apr 16 23:28:35.514801 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:35.514566 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fr2wb" Apr 16 23:28:36.989418 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:36.989377 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zp26z" podUID="e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2" Apr 16 23:28:37.355165 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.355063 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"] Apr 16 23:28:37.371478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.371440 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"] Apr 16 23:28:37.371610 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.371594 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5" Apr 16 23:28:37.374856 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.374829 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 23:28:37.375008 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.374863 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 23:28:37.375008 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.374829 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 23:28:37.375008 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.374829 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 23:28:37.375008 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.374843 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-r9mss\"" Apr 16 23:28:37.375216 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.375034 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 23:28:37.539551 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.539517 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grrps\" (UniqueName: \"kubernetes.io/projected/44ca12aa-7211-4889-91a2-95139cba7d7d-kube-api-access-grrps\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5" Apr 16 23:28:37.539749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.539560 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ca12aa-7211-4889-91a2-95139cba7d7d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.539749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.539588 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ca12aa-7211-4889-91a2-95139cba7d7d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.539749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.539655 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/44ca12aa-7211-4889-91a2-95139cba7d7d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.641027 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.640941 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/44ca12aa-7211-4889-91a2-95139cba7d7d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.641027 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.641022 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grrps\" (UniqueName: \"kubernetes.io/projected/44ca12aa-7211-4889-91a2-95139cba7d7d-kube-api-access-grrps\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.641254 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.641052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ca12aa-7211-4889-91a2-95139cba7d7d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.641254 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.641082 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ca12aa-7211-4889-91a2-95139cba7d7d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.641798 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.641781 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ca12aa-7211-4889-91a2-95139cba7d7d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.643438 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.643407 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ca12aa-7211-4889-91a2-95139cba7d7d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.643553 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.643507 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/44ca12aa-7211-4889-91a2-95139cba7d7d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.648504 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.648481 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grrps\" (UniqueName: \"kubernetes.io/projected/44ca12aa-7211-4889-91a2-95139cba7d7d-kube-api-access-grrps\") pod \"prometheus-operator-5676c8c784-bzvl5\" (UID: \"44ca12aa-7211-4889-91a2-95139cba7d7d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.681324 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.681289 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"
Apr 16 23:28:37.793189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:37.793157 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bzvl5"]
Apr 16 23:28:37.796553 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:37.796526 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ca12aa_7211_4889_91a2_95139cba7d7d.slice/crio-1530c70fc931c1ec66937d412edc3e049283868afa747005da852ed3a1932b08 WatchSource:0}: Error finding container 1530c70fc931c1ec66937d412edc3e049283868afa747005da852ed3a1932b08: Status 404 returned error can't find the container with id 1530c70fc931c1ec66937d412edc3e049283868afa747005da852ed3a1932b08
Apr 16 23:28:38.523622 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:38.523586 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5" event={"ID":"44ca12aa-7211-4889-91a2-95139cba7d7d","Type":"ContainerStarted","Data":"1530c70fc931c1ec66937d412edc3e049283868afa747005da852ed3a1932b08"}
Apr 16 23:28:39.176461 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:39.176429 2564 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:39.176573 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:39.176483 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:39.176953 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:39.176935 2564 scope.go:117] "RemoveContainer" containerID="4e54f717c73a180f66f1634a5620d34eb39f8fe11ba4d0b3f8e4502917730d1f"
Apr 16 23:28:39.177191 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:39.177165 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ldlc_openshift-console-operator(ed5b2be5-9e2a-419f-989a-30f08a0e3d57)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" podUID="ed5b2be5-9e2a-419f-989a-30f08a0e3d57"
Apr 16 23:28:39.527494 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:39.527462 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5" event={"ID":"44ca12aa-7211-4889-91a2-95139cba7d7d","Type":"ContainerStarted","Data":"778e2f4d8cb64bc5eb119a0bb5bce261081bda68b3f383c899ecb93f1fd1185b"}
Apr 16 23:28:39.527494 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:39.527497 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5" event={"ID":"44ca12aa-7211-4889-91a2-95139cba7d7d","Type":"ContainerStarted","Data":"5e0a3fc52d8fb3d19ba0ddaaed57563eb6cdce19a15d739491e983ae29af0bc1"}
Apr 16 23:28:39.543127 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:39.543073 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzvl5" podStartSLOduration=1.201239262 podStartE2EDuration="2.543057622s" podCreationTimestamp="2026-04-16 23:28:37 +0000 UTC" firstStartedPulling="2026-04-16 23:28:37.798547796 +0000 UTC m=+158.414579254" lastFinishedPulling="2026-04-16 23:28:39.140366153 +0000 UTC m=+159.756397614" observedRunningTime="2026-04-16 23:28:39.542429557 +0000 UTC m=+160.158461034" watchObservedRunningTime="2026-04-16 23:28:39.543057622 +0000 UTC m=+160.159089101"
Apr 16 23:28:40.161863 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.161817 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:28:40.162060 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.161907 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:28:40.164269 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.164237 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc4e703-91ac-44ae-9d1a-83214f2378fd-cert\") pod \"ingress-canary-cwjtr\" (UID: \"3dc4e703-91ac-44ae-9d1a-83214f2378fd\") " pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:28:40.164269 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.164248 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9efb52c5-c96d-422a-8c15-e03f71fdd622-metrics-tls\") pod \"dns-default-fr2wb\" (UID: \"9efb52c5-c96d-422a-8c15-e03f71fdd622\") " pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:28:40.318014 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.317981 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-89jck\""
Apr 16 23:28:40.318713 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.318679 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m6886\""
Apr 16 23:28:40.326330 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.326301 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cwjtr"
Apr 16 23:28:40.326403 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.326389 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:28:40.451457 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.451427 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fr2wb"]
Apr 16 23:28:40.455530 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:40.455501 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9efb52c5_c96d_422a_8c15_e03f71fdd622.slice/crio-0c1d7d5320ac5935ee13ab8f5acb5306a080adc94e1a567c5983a151c6f0feaa WatchSource:0}: Error finding container 0c1d7d5320ac5935ee13ab8f5acb5306a080adc94e1a567c5983a151c6f0feaa: Status 404 returned error can't find the container with id 0c1d7d5320ac5935ee13ab8f5acb5306a080adc94e1a567c5983a151c6f0feaa
Apr 16 23:28:40.467762 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.467736 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cwjtr"]
Apr 16 23:28:40.470730 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:40.470686 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc4e703_91ac_44ae_9d1a_83214f2378fd.slice/crio-f295df31740e042caa04e729c070cc371947dfea22941175d6d785d5f89159ee WatchSource:0}: Error finding container f295df31740e042caa04e729c070cc371947dfea22941175d6d785d5f89159ee: Status 404 returned error can't find the container with id f295df31740e042caa04e729c070cc371947dfea22941175d6d785d5f89159ee
Apr 16 23:28:40.531206 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.531172 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fr2wb" event={"ID":"9efb52c5-c96d-422a-8c15-e03f71fdd622","Type":"ContainerStarted","Data":"0c1d7d5320ac5935ee13ab8f5acb5306a080adc94e1a567c5983a151c6f0feaa"}
Apr 16 23:28:40.532292 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:40.532267 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cwjtr" event={"ID":"3dc4e703-91ac-44ae-9d1a-83214f2378fd","Type":"ContainerStarted","Data":"f295df31740e042caa04e729c070cc371947dfea22941175d6d785d5f89159ee"}
Apr 16 23:28:41.668482 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.668450 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"]
Apr 16 23:28:41.671888 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.671866 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.674068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.674043 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qxd76\""
Apr 16 23:28:41.674068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.674063 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 23:28:41.674263 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.674158 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 23:28:41.676265 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.676246 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-f47hd"]
Apr 16 23:28:41.679360 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.679338 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.682517 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.682456 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 23:28:41.682646 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.682534 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 23:28:41.682762 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.682656 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 23:28:41.682856 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.682788 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sfntm\""
Apr 16 23:28:41.683296 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.683263 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"]
Apr 16 23:28:41.691299 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.691275 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-t99s8"]
Apr 16 23:28:41.694558 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.694541 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.697125 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.697102 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 23:28:41.697238 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.697132 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 23:28:41.697238 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.697134 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-z8sb7\""
Apr 16 23:28:41.697238 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.697147 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 23:28:41.705199 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.705177 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-t99s8"]
Apr 16 23:28:41.776807 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.776759 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-root\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.776807 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.776824 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-textfile\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.777071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.776865 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.777071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.776903 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.777071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.776962 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52c47db8-79a8-405f-b69b-00344f1bd3b2-metrics-client-ca\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.777071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.777001 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.777071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.777040 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtpvp\" (UniqueName: \"kubernetes.io/projected/52c47db8-79a8-405f-b69b-00344f1bd3b2-kube-api-access-wtpvp\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.777291 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.777076 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9588a074-c48c-4252-990c-1585ba91f39e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.777291 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.777122 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-tls\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.777291 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.777176 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-sys\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.777291 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.777224 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5dk2\" (UniqueName: \"kubernetes.io/projected/9588a074-c48c-4252-990c-1585ba91f39e-kube-api-access-g5dk2\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.777291 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.777248 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-wtmp\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.777451 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.777301 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.878773 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878734 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-root\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.878966 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878799 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-textfile\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.878966 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878832 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.878966 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878862 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.878966 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878895 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.878966 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878926 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52c47db8-79a8-405f-b69b-00344f1bd3b2-metrics-client-ca\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.878966 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878966 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878967 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-root\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.878996 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82l86\" (UniqueName: \"kubernetes.io/projected/5d9ce457-6a1e-4495-8955-15be5d126952-kube-api-access-82l86\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879034 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtpvp\" (UniqueName: \"kubernetes.io/projected/52c47db8-79a8-405f-b69b-00344f1bd3b2-kube-api-access-wtpvp\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879068 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9588a074-c48c-4252-990c-1585ba91f39e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879114 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-tls\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879155 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-sys\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879184 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5dk2\" (UniqueName: \"kubernetes.io/projected/9588a074-c48c-4252-990c-1585ba91f39e-kube-api-access-g5dk2\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879207 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-wtmp\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879233 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d9ce457-6a1e-4495-8955-15be5d126952-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.879273 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879265 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.879779 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879322 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.879779 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879357 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.879779 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879416 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/5d9ce457-6a1e-4495-8955-15be5d126952-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.879779 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:41.879557 2564 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 23:28:41.879779 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:41.879624 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-tls podName:52c47db8-79a8-405f-b69b-00344f1bd3b2 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:42.379604421 +0000 UTC m=+162.995635880 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-tls") pod "node-exporter-f47hd" (UID: "52c47db8-79a8-405f-b69b-00344f1bd3b2") : secret "node-exporter-tls" not found
Apr 16 23:28:41.880037 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.879907 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-sys\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.881417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.880201 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9588a074-c48c-4252-990c-1585ba91f39e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.881417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.880277 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-wtmp\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.881417 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:41.880313 2564 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 16 23:28:41.881417 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:41.880378 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-tls podName:9588a074-c48c-4252-990c-1585ba91f39e nodeName:}" failed. No retries permitted until 2026-04-16 23:28:42.380361101 +0000 UTC m=+162.996392563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-7mr58" (UID: "9588a074-c48c-4252-990c-1585ba91f39e") : secret "openshift-state-metrics-tls" not found
Apr 16 23:28:41.881417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.880897 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-textfile\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.881417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.881017 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.881417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.881375 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52c47db8-79a8-405f-b69b-00344f1bd3b2-metrics-client-ca\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.882312 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.882285 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.882421 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.882365 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.889332 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.889307 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtpvp\" (UniqueName: \"kubernetes.io/projected/52c47db8-79a8-405f-b69b-00344f1bd3b2-kube-api-access-wtpvp\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd"
Apr 16 23:28:41.889757 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.889659 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5dk2\" (UniqueName: \"kubernetes.io/projected/9588a074-c48c-4252-990c-1585ba91f39e-kube-api-access-g5dk2\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"
Apr 16 23:28:41.980446 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.980409 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.980630 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.980477 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82l86\" (UniqueName: \"kubernetes.io/projected/5d9ce457-6a1e-4495-8955-15be5d126952-kube-api-access-82l86\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.980630 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.980542 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d9ce457-6a1e-4495-8955-15be5d126952-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.980630 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.980576 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8"
Apr 16 23:28:41.980630 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.980622 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:41.980853 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.980654 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/5d9ce457-6a1e-4495-8955-15be5d126952-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:41.981316 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.981287 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d9ce457-6a1e-4495-8955-15be5d126952-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:41.981575 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.981551 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:41.981895 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.981877 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/5d9ce457-6a1e-4495-8955-15be5d126952-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:41.983320 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.983294 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:41.983474 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.983458 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d9ce457-6a1e-4495-8955-15be5d126952-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:41.988864 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:41.988839 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82l86\" (UniqueName: \"kubernetes.io/projected/5d9ce457-6a1e-4495-8955-15be5d126952-kube-api-access-82l86\") pod \"kube-state-metrics-69db897b98-t99s8\" (UID: \"5d9ce457-6a1e-4495-8955-15be5d126952\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:42.005363 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.005339 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" Apr 16 23:28:42.384482 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.384393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58" Apr 16 23:28:42.384482 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.384475 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-tls\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd" Apr 16 23:28:42.387312 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.387281 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52c47db8-79a8-405f-b69b-00344f1bd3b2-node-exporter-tls\") pod \"node-exporter-f47hd\" (UID: \"52c47db8-79a8-405f-b69b-00344f1bd3b2\") " pod="openshift-monitoring/node-exporter-f47hd" Apr 16 23:28:42.387496 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.387471 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9588a074-c48c-4252-990c-1585ba91f39e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7mr58\" (UID: \"9588a074-c48c-4252-990c-1585ba91f39e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58" Apr 16 23:28:42.584747 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.584717 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58" Apr 16 23:28:42.593268 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.592918 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-f47hd" Apr 16 23:28:42.636325 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.636178 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-t99s8"] Apr 16 23:28:42.643307 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:42.642178 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d9ce457_6a1e_4495_8955_15be5d126952.slice/crio-38543500fb1cd3bcaae0f74ef04df1478ae55405787ff8ca617c98275fff197a WatchSource:0}: Error finding container 38543500fb1cd3bcaae0f74ef04df1478ae55405787ff8ca617c98275fff197a: Status 404 returned error can't find the container with id 38543500fb1cd3bcaae0f74ef04df1478ae55405787ff8ca617c98275fff197a Apr 16 23:28:42.744020 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:42.743883 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58"] Apr 16 23:28:42.747198 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:42.747163 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9588a074_c48c_4252_990c_1585ba91f39e.slice/crio-1ae22e6bbc606ef5ffc64b0974767f81a1680c62929792849f0d74ed7413eb8c WatchSource:0}: Error finding container 1ae22e6bbc606ef5ffc64b0974767f81a1680c62929792849f0d74ed7413eb8c: Status 404 returned error can't find the container with id 1ae22e6bbc606ef5ffc64b0974767f81a1680c62929792849f0d74ed7413eb8c Apr 16 23:28:43.543220 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.543171 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fr2wb" 
event={"ID":"9efb52c5-c96d-422a-8c15-e03f71fdd622","Type":"ContainerStarted","Data":"f4f02cee439be8f9fd6be5fd5c228f4b483e1205c0fd3086b0ee3faa5c0207cb"} Apr 16 23:28:43.543397 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.543223 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fr2wb" event={"ID":"9efb52c5-c96d-422a-8c15-e03f71fdd622","Type":"ContainerStarted","Data":"f1714a7c4f70dfdde8718c896f14ca4461d014eb872447f3e191d091376cb955"} Apr 16 23:28:43.543397 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.543330 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fr2wb" Apr 16 23:28:43.544964 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.544934 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" event={"ID":"5d9ce457-6a1e-4495-8955-15be5d126952","Type":"ContainerStarted","Data":"38543500fb1cd3bcaae0f74ef04df1478ae55405787ff8ca617c98275fff197a"} Apr 16 23:28:43.546989 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.546964 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58" event={"ID":"9588a074-c48c-4252-990c-1585ba91f39e","Type":"ContainerStarted","Data":"d96669c1a542371ce7b1d470939d404e461b81a1131d99b0084b3a529cc5c41d"} Apr 16 23:28:43.547109 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.546995 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58" event={"ID":"9588a074-c48c-4252-990c-1585ba91f39e","Type":"ContainerStarted","Data":"b9d4730fc56b1cdd53ae2aa84efae51cfba255396ad1838516b4f7918b96c035"} Apr 16 23:28:43.547109 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.547012 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58" 
event={"ID":"9588a074-c48c-4252-990c-1585ba91f39e","Type":"ContainerStarted","Data":"1ae22e6bbc606ef5ffc64b0974767f81a1680c62929792849f0d74ed7413eb8c"} Apr 16 23:28:43.548636 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.548612 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cwjtr" event={"ID":"3dc4e703-91ac-44ae-9d1a-83214f2378fd","Type":"ContainerStarted","Data":"e4f3132b7f3ecdefb6e1fe09beb43a5945dc2f8dbb56a03aea89cd26785acaf5"} Apr 16 23:28:43.550383 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.550359 2564 generic.go:358] "Generic (PLEG): container finished" podID="52c47db8-79a8-405f-b69b-00344f1bd3b2" containerID="9fb472958637603344231e317ae35af78cee0e4ceda4d29635ad8e31e372d7b8" exitCode=0 Apr 16 23:28:43.550491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.550411 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f47hd" event={"ID":"52c47db8-79a8-405f-b69b-00344f1bd3b2","Type":"ContainerDied","Data":"9fb472958637603344231e317ae35af78cee0e4ceda4d29635ad8e31e372d7b8"} Apr 16 23:28:43.550491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.550436 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f47hd" event={"ID":"52c47db8-79a8-405f-b69b-00344f1bd3b2","Type":"ContainerStarted","Data":"7f998ddefc8fe2b750a6224946355522031313249b3dfc10992528fbf6ca17c6"} Apr 16 23:28:43.558732 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.558643 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fr2wb" podStartSLOduration=129.52139512 podStartE2EDuration="2m11.558629506s" podCreationTimestamp="2026-04-16 23:26:32 +0000 UTC" firstStartedPulling="2026-04-16 23:28:40.457509382 +0000 UTC m=+161.073540839" lastFinishedPulling="2026-04-16 23:28:42.494743753 +0000 UTC m=+163.110775225" observedRunningTime="2026-04-16 23:28:43.557326936 +0000 UTC m=+164.173358428" 
watchObservedRunningTime="2026-04-16 23:28:43.558629506 +0000 UTC m=+164.174660984" Apr 16 23:28:43.574563 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.574508 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cwjtr" podStartSLOduration=129.546296266 podStartE2EDuration="2m11.574490134s" podCreationTimestamp="2026-04-16 23:26:32 +0000 UTC" firstStartedPulling="2026-04-16 23:28:40.472490374 +0000 UTC m=+161.088521831" lastFinishedPulling="2026-04-16 23:28:42.500684224 +0000 UTC m=+163.116715699" observedRunningTime="2026-04-16 23:28:43.572857941 +0000 UTC m=+164.188889417" watchObservedRunningTime="2026-04-16 23:28:43.574490134 +0000 UTC m=+164.190521615" Apr 16 23:28:43.905284 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.905195 2564 patch_prober.go:28] interesting pod/image-registry-79f5857d9-2hkvf container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 23:28:43.905284 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:43.905250 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" podUID="f6ba0269-a448-409c-a5ac-b14dea6a67bf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:28:44.555570 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.555468 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" event={"ID":"5d9ce457-6a1e-4495-8955-15be5d126952","Type":"ContainerStarted","Data":"2c863675d18178af1d0d11571a97a653eb39a1bc42adb9ca4916e93b6b980625"} Apr 16 23:28:44.555570 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.555529 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" event={"ID":"5d9ce457-6a1e-4495-8955-15be5d126952","Type":"ContainerStarted","Data":"a6bf5ecfcb6b808a8a2a333f1ce013247c8ad338ee902b3e415746966b0be16a"} Apr 16 23:28:44.555570 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.555544 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" event={"ID":"5d9ce457-6a1e-4495-8955-15be5d126952","Type":"ContainerStarted","Data":"a9062e9484e10db8e43e7c89c9d38aadd24ef791a10869ceed07bee63f95000d"} Apr 16 23:28:44.558246 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.557764 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58" event={"ID":"9588a074-c48c-4252-990c-1585ba91f39e","Type":"ContainerStarted","Data":"7aacc4f28f90c3548545806971c9187649e09f4407e8543bb6249c5fdd1b8a7d"} Apr 16 23:28:44.565617 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.565583 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f47hd" event={"ID":"52c47db8-79a8-405f-b69b-00344f1bd3b2","Type":"ContainerStarted","Data":"b2dac85ba5cbc06edd2c13bb57cc0e29512e71db6a7a118f4159d371cc5753f3"} Apr 16 23:28:44.565779 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.565627 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f47hd" event={"ID":"52c47db8-79a8-405f-b69b-00344f1bd3b2","Type":"ContainerStarted","Data":"77ef7c70a9a693b3931e1af9d95c9d265961f7b213d999a816a8bfbc23609df9"} Apr 16 23:28:44.579340 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.579282 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-t99s8" podStartSLOduration=1.988849075 podStartE2EDuration="3.57926446s" podCreationTimestamp="2026-04-16 23:28:41 +0000 UTC" firstStartedPulling="2026-04-16 23:28:42.645829557 +0000 UTC 
m=+163.261861014" lastFinishedPulling="2026-04-16 23:28:44.236244928 +0000 UTC m=+164.852276399" observedRunningTime="2026-04-16 23:28:44.57857502 +0000 UTC m=+165.194606502" watchObservedRunningTime="2026-04-16 23:28:44.57926446 +0000 UTC m=+165.195295921" Apr 16 23:28:44.596816 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.596657 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-f47hd" podStartSLOduration=2.9015192990000003 podStartE2EDuration="3.596640943s" podCreationTimestamp="2026-04-16 23:28:41 +0000 UTC" firstStartedPulling="2026-04-16 23:28:42.612064935 +0000 UTC m=+163.228096393" lastFinishedPulling="2026-04-16 23:28:43.307186565 +0000 UTC m=+163.923218037" observedRunningTime="2026-04-16 23:28:44.596405016 +0000 UTC m=+165.212436508" watchObservedRunningTime="2026-04-16 23:28:44.596640943 +0000 UTC m=+165.212672423" Apr 16 23:28:44.611769 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:44.611684 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mr58" podStartSLOduration=2.242970173 podStartE2EDuration="3.611661596s" podCreationTimestamp="2026-04-16 23:28:41 +0000 UTC" firstStartedPulling="2026-04-16 23:28:42.869256973 +0000 UTC m=+163.485288429" lastFinishedPulling="2026-04-16 23:28:44.237948394 +0000 UTC m=+164.853979852" observedRunningTime="2026-04-16 23:28:44.61138079 +0000 UTC m=+165.227412271" watchObservedRunningTime="2026-04-16 23:28:44.611661596 +0000 UTC m=+165.227693078" Apr 16 23:28:45.973681 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.973644 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6db978cd86-8m9t5"] Apr 16 23:28:45.976825 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.976807 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:45.978963 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.978935 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 23:28:45.979058 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.978935 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 23:28:45.979614 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.979594 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 23:28:45.979778 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.979759 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-bgf5biviqi2vf\"" Apr 16 23:28:45.979866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.979786 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 23:28:45.979866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.979827 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-bwcl9\"" Apr 16 23:28:45.986309 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:45.986279 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6db978cd86-8m9t5"] Apr 16 23:28:46.018351 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.018315 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97pvs\" (UniqueName: \"kubernetes.io/projected/b68074bf-4c69-4374-ad15-d584ea87107b-kube-api-access-97pvs\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " 
pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.018547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.018362 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68074bf-4c69-4374-ad15-d584ea87107b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.018547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.018398 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-secret-metrics-server-tls\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.018547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.018424 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-client-ca-bundle\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.018547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.018443 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b68074bf-4c69-4374-ad15-d584ea87107b-audit-log\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.018547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.018461 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b68074bf-4c69-4374-ad15-d584ea87107b-metrics-server-audit-profiles\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.018547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.018534 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-secret-metrics-server-client-certs\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.119565 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.119528 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97pvs\" (UniqueName: \"kubernetes.io/projected/b68074bf-4c69-4374-ad15-d584ea87107b-kube-api-access-97pvs\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.119671 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.119580 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68074bf-4c69-4374-ad15-d584ea87107b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.119759 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.119729 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-secret-metrics-server-tls\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.119815 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.119798 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-client-ca-bundle\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.119846 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.119838 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b68074bf-4c69-4374-ad15-d584ea87107b-audit-log\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.119888 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.119870 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b68074bf-4c69-4374-ad15-d584ea87107b-metrics-server-audit-profiles\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:28:46.119936 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.119907 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-secret-metrics-server-client-certs\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 
23:28:46.120232 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.120213 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b68074bf-4c69-4374-ad15-d584ea87107b-audit-log\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:28:46.120337 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.120250 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68074bf-4c69-4374-ad15-d584ea87107b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:28:46.120641 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.120623 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b68074bf-4c69-4374-ad15-d584ea87107b-metrics-server-audit-profiles\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:28:46.122207 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.122189 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-client-ca-bundle\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:28:46.122453 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.122433 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-secret-metrics-server-tls\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:28:46.122489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.122463 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b68074bf-4c69-4374-ad15-d584ea87107b-secret-metrics-server-client-certs\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:28:46.127216 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.127194 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97pvs\" (UniqueName: \"kubernetes.io/projected/b68074bf-4c69-4374-ad15-d584ea87107b-kube-api-access-97pvs\") pod \"metrics-server-6db978cd86-8m9t5\" (UID: \"b68074bf-4c69-4374-ad15-d584ea87107b\") " pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:28:46.287447 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.287350 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:28:46.405915 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.405849 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6db978cd86-8m9t5"]
Apr 16 23:28:46.409214 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:46.409182 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb68074bf_4c69_4374_ad15_d584ea87107b.slice/crio-8cc811f7bea5eee55aa6a44a303f808243f67229f0587d9c66551912324fca4c WatchSource:0}: Error finding container 8cc811f7bea5eee55aa6a44a303f808243f67229f0587d9c66551912324fca4c: Status 404 returned error can't find the container with id 8cc811f7bea5eee55aa6a44a303f808243f67229f0587d9c66551912324fca4c
Apr 16 23:28:46.571460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.571346 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" event={"ID":"b68074bf-4c69-4374-ad15-d584ea87107b","Type":"ContainerStarted","Data":"8cc811f7bea5eee55aa6a44a303f808243f67229f0587d9c66551912324fca4c"}
Apr 16 23:28:46.866234 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.866154 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"]
Apr 16 23:28:46.871191 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.871174 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:46.873159 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.873138 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 23:28:46.873263 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.873174 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 23:28:46.873326 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.873312 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 23:28:46.873383 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.873339 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 23:28:46.874015 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.873996 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-6xtw8\""
Apr 16 23:28:46.874165 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.874146 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 23:28:46.877535 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.877514 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 23:28:46.881176 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.881154 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"]
Apr 16 23:28:46.928287 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.928253 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccgvj\" (UniqueName: \"kubernetes.io/projected/a160bce0-fcf3-4730-bdeb-d185a0828bd4-kube-api-access-ccgvj\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:46.928287 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.928287 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:46.928506 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.928315 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-telemeter-client-tls\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:46.928506 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.928338 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-metrics-client-ca\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:46.928506 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.928380 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:46.928506 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.928397 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-serving-certs-ca-bundle\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:46.928506 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.928481 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-federate-client-tls\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:46.928683 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:46.928517 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-secret-telemeter-client\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.029148 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.029112 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-federate-client-tls\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.029611 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.029158 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-secret-telemeter-client\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.029611 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.029188 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccgvj\" (UniqueName: \"kubernetes.io/projected/a160bce0-fcf3-4730-bdeb-d185a0828bd4-kube-api-access-ccgvj\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.029611 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.029217 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.029611 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.029257 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-telemeter-client-tls\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.029868 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.029744 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-metrics-client-ca\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.029868 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.029793 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.029868 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.029826 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-serving-certs-ca-bundle\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.030311 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.030284 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.030562 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.030512 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-metrics-client-ca\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.030562 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.030521 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a160bce0-fcf3-4730-bdeb-d185a0828bd4-serving-certs-ca-bundle\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.032296 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.032273 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-secret-telemeter-client\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.032408 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.032387 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-telemeter-client-tls\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.032515 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.032485 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.032886 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.032861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a160bce0-fcf3-4730-bdeb-d185a0828bd4-federate-client-tls\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.036611 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.036590 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccgvj\" (UniqueName: \"kubernetes.io/projected/a160bce0-fcf3-4730-bdeb-d185a0828bd4-kube-api-access-ccgvj\") pod \"telemeter-client-89fc5cd9-nnkdg\" (UID: \"a160bce0-fcf3-4730-bdeb-d185a0828bd4\") " pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.180816 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.180726 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"
Apr 16 23:28:47.318333 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.318304 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-89fc5cd9-nnkdg"]
Apr 16 23:28:47.322239 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:47.322207 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda160bce0_fcf3_4730_bdeb_d185a0828bd4.slice/crio-7536800bbcaadab4f65c124259337cb701bf73b2921c17dcec5a9e5a41646499 WatchSource:0}: Error finding container 7536800bbcaadab4f65c124259337cb701bf73b2921c17dcec5a9e5a41646499: Status 404 returned error can't find the container with id 7536800bbcaadab4f65c124259337cb701bf73b2921c17dcec5a9e5a41646499
Apr 16 23:28:47.577227 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:47.577183 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg" event={"ID":"a160bce0-fcf3-4730-bdeb-d185a0828bd4","Type":"ContainerStarted","Data":"7536800bbcaadab4f65c124259337cb701bf73b2921c17dcec5a9e5a41646499"}
Apr 16 23:28:48.581328 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:48.581290 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" event={"ID":"b68074bf-4c69-4374-ad15-d584ea87107b","Type":"ContainerStarted","Data":"39a2ef0cf7597e52bbad20f0db2794a2002ba19959a606b17912c545072cc5aa"}
Apr 16 23:28:48.596920 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:48.596866 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" podStartSLOduration=2.09318523 podStartE2EDuration="3.596851869s" podCreationTimestamp="2026-04-16 23:28:45 +0000 UTC" firstStartedPulling="2026-04-16 23:28:46.411816039 +0000 UTC m=+167.027847496" lastFinishedPulling="2026-04-16 23:28:47.915482675 +0000 UTC m=+168.531514135" observedRunningTime="2026-04-16 23:28:48.596342347 +0000 UTC m=+169.212373827" watchObservedRunningTime="2026-04-16 23:28:48.596851869 +0000 UTC m=+169.212883350"
Apr 16 23:28:49.585898 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:49.585860 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg" event={"ID":"a160bce0-fcf3-4730-bdeb-d185a0828bd4","Type":"ContainerStarted","Data":"6213a8183b6e37f7006443ae9bba6249c8d0271fcdc136cf6c8adaa200b184b9"}
Apr 16 23:28:50.589954 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:50.589875 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg" event={"ID":"a160bce0-fcf3-4730-bdeb-d185a0828bd4","Type":"ContainerStarted","Data":"b4334aeda16648d8498a9aa0bfff5ca25d4fd303e6530829705772a700d1ba4e"}
Apr 16 23:28:50.589954 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:50.589910 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg" event={"ID":"a160bce0-fcf3-4730-bdeb-d185a0828bd4","Type":"ContainerStarted","Data":"ff55142bcad1b12f6f0ccb419ed0d2fdbec36c196cae1e53cc32205b8d39092c"}
Apr 16 23:28:50.610515 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:50.610464 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-89fc5cd9-nnkdg" podStartSLOduration=1.607371329 podStartE2EDuration="4.61044859s" podCreationTimestamp="2026-04-16 23:28:46 +0000 UTC" firstStartedPulling="2026-04-16 23:28:47.324639206 +0000 UTC m=+167.940670678" lastFinishedPulling="2026-04-16 23:28:50.327716474 +0000 UTC m=+170.943747939" observedRunningTime="2026-04-16 23:28:50.609306717 +0000 UTC m=+171.225338196" watchObservedRunningTime="2026-04-16 23:28:50.61044859 +0000 UTC m=+171.226480108"
Apr 16 23:28:51.977888 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:51.977846 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:28:53.568332 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:53.568303 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fr2wb"
Apr 16 23:28:53.904361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:53.904279 2564 patch_prober.go:28] interesting pod/image-registry-79f5857d9-2hkvf container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 23:28:53.904361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:53.904333 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" podUID="f6ba0269-a448-409c-a5ac-b14dea6a67bf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 23:28:53.978292 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:53.978261 2564 scope.go:117] "RemoveContainer" containerID="4e54f717c73a180f66f1634a5620d34eb39f8fe11ba4d0b3f8e4502917730d1f"
Apr 16 23:28:54.606152 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.606125 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log"
Apr 16 23:28:54.606608 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.606189 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" event={"ID":"ed5b2be5-9e2a-419f-989a-30f08a0e3d57","Type":"ContainerStarted","Data":"42efa745d3c1d0a4f1a56eee34b219cfa22a5ba7d3751795d8e8cbe17597b9a7"}
Apr 16 23:28:54.606608 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.606469 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:54.621052 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.620998 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc" podStartSLOduration=44.109048334 podStartE2EDuration="46.620978167s" podCreationTimestamp="2026-04-16 23:28:08 +0000 UTC" firstStartedPulling="2026-04-16 23:28:09.29456659 +0000 UTC m=+129.910598060" lastFinishedPulling="2026-04-16 23:28:11.806496422 +0000 UTC m=+132.422527893" observedRunningTime="2026-04-16 23:28:54.620373271 +0000 UTC m=+175.236404789" watchObservedRunningTime="2026-04-16 23:28:54.620978167 +0000 UTC m=+175.237009648"
Apr 16 23:28:54.628769 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.628746 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ldlc"
Apr 16 23:28:54.806033 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.806004 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-lq92s"]
Apr 16 23:28:54.809252 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.809232 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lq92s"
Apr 16 23:28:54.811478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.811456 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-k5xwh\""
Apr 16 23:28:54.811565 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.811457 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 23:28:54.811565 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.811457 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 23:28:54.816383 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.816364 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lq92s"]
Apr 16 23:28:54.905898 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:54.905803 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bft4d\" (UniqueName: \"kubernetes.io/projected/9d0463f5-8d66-479e-a0dc-a13347a782d2-kube-api-access-bft4d\") pod \"downloads-6bcc868b7-lq92s\" (UID: \"9d0463f5-8d66-479e-a0dc-a13347a782d2\") " pod="openshift-console/downloads-6bcc868b7-lq92s"
Apr 16 23:28:55.007171 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:55.007123 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bft4d\" (UniqueName: \"kubernetes.io/projected/9d0463f5-8d66-479e-a0dc-a13347a782d2-kube-api-access-bft4d\") pod \"downloads-6bcc868b7-lq92s\" (UID: \"9d0463f5-8d66-479e-a0dc-a13347a782d2\") " pod="openshift-console/downloads-6bcc868b7-lq92s"
Apr 16 23:28:55.014263 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:55.014233 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bft4d\" (UniqueName: \"kubernetes.io/projected/9d0463f5-8d66-479e-a0dc-a13347a782d2-kube-api-access-bft4d\") pod \"downloads-6bcc868b7-lq92s\" (UID: \"9d0463f5-8d66-479e-a0dc-a13347a782d2\") " pod="openshift-console/downloads-6bcc868b7-lq92s"
Apr 16 23:28:55.119073 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:55.119046 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lq92s"
Apr 16 23:28:55.257015 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:55.256980 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lq92s"]
Apr 16 23:28:55.260637 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:28:55.260610 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0463f5_8d66_479e_a0dc_a13347a782d2.slice/crio-8a8a615ba565445359eab4a47b57659b21059dca4cd5e930792667c56c772d23 WatchSource:0}: Error finding container 8a8a615ba565445359eab4a47b57659b21059dca4cd5e930792667c56c772d23: Status 404 returned error can't find the container with id 8a8a615ba565445359eab4a47b57659b21059dca4cd5e930792667c56c772d23
Apr 16 23:28:55.609869 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:55.609829 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lq92s" event={"ID":"9d0463f5-8d66-479e-a0dc-a13347a782d2","Type":"ContainerStarted","Data":"8a8a615ba565445359eab4a47b57659b21059dca4cd5e930792667c56c772d23"}
Apr 16 23:28:58.918355 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:58.918309 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" podUID="f6ba0269-a448-409c-a5ac-b14dea6a67bf" containerName="registry" containerID="cri-o://b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49" gracePeriod=30
Apr 16 23:28:59.178167 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.178101 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf"
Apr 16 23:28:59.248053 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248018 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-trusted-ca\") pod \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") "
Apr 16 23:28:59.248214 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248067 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") pod \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") "
Apr 16 23:28:59.248214 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248109 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-bound-sa-token\") pod \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") "
Apr 16 23:28:59.248337 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248224 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6ba0269-a448-409c-a5ac-b14dea6a67bf-ca-trust-extracted\") pod \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") "
Apr 16 23:28:59.248337 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248280 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-installation-pull-secrets\") pod \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") "
Apr 16 23:28:59.248441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248337 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkvxg\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-kube-api-access-tkvxg\") pod \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") "
Apr 16 23:28:59.248441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248378 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-certificates\") pod \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") "
Apr 16 23:28:59.248441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248409 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-image-registry-private-configuration\") pod \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\" (UID: \"f6ba0269-a448-409c-a5ac-b14dea6a67bf\") "
Apr 16 23:28:59.248582 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248534 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f6ba0269-a448-409c-a5ac-b14dea6a67bf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:28:59.248963 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248921 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-trusted-ca\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:28:59.248963 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.248937 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f6ba0269-a448-409c-a5ac-b14dea6a67bf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:28:59.251166 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.251117 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f6ba0269-a448-409c-a5ac-b14dea6a67bf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:28:59.251453 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.251358 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f6ba0269-a448-409c-a5ac-b14dea6a67bf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:28:59.251453 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.251422 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-kube-api-access-tkvxg" (OuterVolumeSpecName: "kube-api-access-tkvxg") pod "f6ba0269-a448-409c-a5ac-b14dea6a67bf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf"). InnerVolumeSpecName "kube-api-access-tkvxg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:28:59.251590 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.251490 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f6ba0269-a448-409c-a5ac-b14dea6a67bf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:28:59.251646 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.251634 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f6ba0269-a448-409c-a5ac-b14dea6a67bf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:28:59.259020 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.258991 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6ba0269-a448-409c-a5ac-b14dea6a67bf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f6ba0269-a448-409c-a5ac-b14dea6a67bf" (UID: "f6ba0269-a448-409c-a5ac-b14dea6a67bf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:28:59.349943 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.349910 2564 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-tls\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:28:59.349943 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.349946 2564 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-bound-sa-token\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:28:59.350147 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.349961 2564 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6ba0269-a448-409c-a5ac-b14dea6a67bf-ca-trust-extracted\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:28:59.350147 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.349978 2564 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-installation-pull-secrets\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:28:59.350147 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.349994 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkvxg\" (UniqueName: \"kubernetes.io/projected/f6ba0269-a448-409c-a5ac-b14dea6a67bf-kube-api-access-tkvxg\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:28:59.350147 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.350008 2564 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6ba0269-a448-409c-a5ac-b14dea6a67bf-registry-certificates\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:28:59.350147 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.350024 2564 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6ba0269-a448-409c-a5ac-b14dea6a67bf-image-registry-private-configuration\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:28:59.625899 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.625865 2564 generic.go:358] "Generic (PLEG): container finished" podID="f6ba0269-a448-409c-a5ac-b14dea6a67bf" containerID="b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49" exitCode=0
Apr 16 23:28:59.626106 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.625964 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf"
Apr 16 23:28:59.626106 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.625963 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" event={"ID":"f6ba0269-a448-409c-a5ac-b14dea6a67bf","Type":"ContainerDied","Data":"b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49"}
Apr 16 23:28:59.626106 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.626083 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79f5857d9-2hkvf" event={"ID":"f6ba0269-a448-409c-a5ac-b14dea6a67bf","Type":"ContainerDied","Data":"b1b40809ef2629821184fd92f26689ad836fb5d1948523547647662e11ff184b"}
Apr 16 23:28:59.626289 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.626106 2564 scope.go:117] "RemoveContainer" containerID="b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49"
Apr 16 23:28:59.635391 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.635336 2564 scope.go:117] "RemoveContainer" containerID="b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49"
Apr 16 23:28:59.635678 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:28:59.635637
2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49\": container with ID starting with b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49 not found: ID does not exist" containerID="b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49" Apr 16 23:28:59.635804 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.635674 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49"} err="failed to get container status \"b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49\": rpc error: code = NotFound desc = could not find container \"b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49\": container with ID starting with b9f7d3a2dae83929970388e85783cdd8822fc112e47fb7d6b93a7b05866dcf49 not found: ID does not exist" Apr 16 23:28:59.648489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.648461 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-79f5857d9-2hkvf"] Apr 16 23:28:59.651716 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.651674 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-79f5857d9-2hkvf"] Apr 16 23:28:59.982293 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:28:59.982258 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6ba0269-a448-409c-a5ac-b14dea6a67bf" path="/var/lib/kubelet/pods/f6ba0269-a448-409c-a5ac-b14dea6a67bf/volumes" Apr 16 23:29:00.132226 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.132096 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bf9d7b449-82fvm"] Apr 16 23:29:00.132536 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.132517 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="f6ba0269-a448-409c-a5ac-b14dea6a67bf" containerName="registry" Apr 16 23:29:00.132627 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.132539 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ba0269-a448-409c-a5ac-b14dea6a67bf" containerName="registry" Apr 16 23:29:00.132679 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.132634 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6ba0269-a448-409c-a5ac-b14dea6a67bf" containerName="registry" Apr 16 23:29:00.138353 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.138325 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.140731 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.140685 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 23:29:00.140897 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.140735 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 23:29:00.140897 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.140786 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 23:29:00.140897 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.140737 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 23:29:00.140897 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.140862 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 23:29:00.141144 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.141081 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jz99f\"" Apr 16 23:29:00.144919 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:29:00.144898 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bf9d7b449-82fvm"] Apr 16 23:29:00.258579 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.258545 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-oauth-serving-cert\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.258786 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.258586 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-serving-cert\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.258786 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.258604 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-oauth-config\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.258786 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.258658 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-477hd\" (UniqueName: \"kubernetes.io/projected/9d861541-9a91-4298-bac7-0ca08bf9f31b-kube-api-access-477hd\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.258786 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.258730 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-service-ca\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.258953 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.258823 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-config\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.360097 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.360058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-config\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.360256 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.360165 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-oauth-serving-cert\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.360256 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.360200 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-serving-cert\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.360256 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:29:00.360225 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-oauth-config\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.360256 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.360248 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-477hd\" (UniqueName: \"kubernetes.io/projected/9d861541-9a91-4298-bac7-0ca08bf9f31b-kube-api-access-477hd\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.360495 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.360465 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-service-ca\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.360957 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.360923 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-config\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.361049 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.360964 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-oauth-serving-cert\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 
16 23:29:00.361166 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.361144 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-service-ca\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.362980 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.362957 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-oauth-config\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.363084 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.362968 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-serving-cert\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.368529 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.368507 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-477hd\" (UniqueName: \"kubernetes.io/projected/9d861541-9a91-4298-bac7-0ca08bf9f31b-kube-api-access-477hd\") pod \"console-bf9d7b449-82fvm\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") " pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.451102 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.451021 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:00.589486 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.589452 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bf9d7b449-82fvm"] Apr 16 23:29:00.592539 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:29:00.592510 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d861541_9a91_4298_bac7_0ca08bf9f31b.slice/crio-4e55024354ac3fd7a3470cda673f7477484870ee511ebd8b2d0f5b564aa0a760 WatchSource:0}: Error finding container 4e55024354ac3fd7a3470cda673f7477484870ee511ebd8b2d0f5b564aa0a760: Status 404 returned error can't find the container with id 4e55024354ac3fd7a3470cda673f7477484870ee511ebd8b2d0f5b564aa0a760 Apr 16 23:29:00.630567 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:00.630535 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bf9d7b449-82fvm" event={"ID":"9d861541-9a91-4298-bac7-0ca08bf9f31b","Type":"ContainerStarted","Data":"4e55024354ac3fd7a3470cda673f7477484870ee511ebd8b2d0f5b564aa0a760"} Apr 16 23:29:04.645943 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:04.645890 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bf9d7b449-82fvm" event={"ID":"9d861541-9a91-4298-bac7-0ca08bf9f31b","Type":"ContainerStarted","Data":"afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e"} Apr 16 23:29:04.662920 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:04.662863 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bf9d7b449-82fvm" podStartSLOduration=1.623237096 podStartE2EDuration="4.662846679s" podCreationTimestamp="2026-04-16 23:29:00 +0000 UTC" firstStartedPulling="2026-04-16 23:29:00.594791215 +0000 UTC m=+181.210822672" lastFinishedPulling="2026-04-16 23:29:03.634400795 +0000 UTC m=+184.250432255" observedRunningTime="2026-04-16 
23:29:04.662206543 +0000 UTC m=+185.278238021" watchObservedRunningTime="2026-04-16 23:29:04.662846679 +0000 UTC m=+185.278878158" Apr 16 23:29:06.288363 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:06.288323 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:29:06.288862 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:06.288378 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5" Apr 16 23:29:08.510063 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.510028 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dbd66f789-d4wvd"] Apr 16 23:29:08.513390 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.513368 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.520685 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.520213 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 23:29:08.522932 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.522885 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dbd66f789-d4wvd"] Apr 16 23:29:08.640808 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.640764 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-serving-cert\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.640996 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.640820 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-oauth-config\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.640996 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.640857 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-service-ca\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.640996 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.640895 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-trusted-ca-bundle\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.640996 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.640957 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-config\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.640996 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.640975 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmbrw\" (UniqueName: \"kubernetes.io/projected/96951181-f52d-4ace-9e00-53ed27d2e6cc-kube-api-access-rmbrw\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.641253 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:29:08.641099 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-oauth-serving-cert\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.741661 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.741618 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-service-ca\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.741868 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.741670 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-trusted-ca-bundle\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.741954 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.741871 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-config\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.741954 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.741919 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmbrw\" (UniqueName: \"kubernetes.io/projected/96951181-f52d-4ace-9e00-53ed27d2e6cc-kube-api-access-rmbrw\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " 
pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.742072 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.741992 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-oauth-serving-cert\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.742128 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.742070 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-serving-cert\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.742128 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.742102 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-oauth-config\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.742437 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.742408 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-service-ca\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.742556 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.742535 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-trusted-ca-bundle\") pod \"console-5dbd66f789-d4wvd\" (UID: 
\"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.742616 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.742567 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-config\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.742890 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.742864 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-oauth-serving-cert\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.744821 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.744794 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-oauth-config\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.745021 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.745003 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-serving-cert\") pod \"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.749251 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.749230 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmbrw\" (UniqueName: \"kubernetes.io/projected/96951181-f52d-4ace-9e00-53ed27d2e6cc-kube-api-access-rmbrw\") pod 
\"console-5dbd66f789-d4wvd\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") " pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:08.825054 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:08.824966 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:29:10.451818 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:10.451779 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:10.451818 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:10.451824 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:10.457525 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:10.457499 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:10.670900 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:10.670872 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bf9d7b449-82fvm" Apr 16 23:29:11.251668 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:11.251638 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dbd66f789-d4wvd"] Apr 16 23:29:11.254965 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:29:11.254933 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96951181_f52d_4ace_9e00_53ed27d2e6cc.slice/crio-4f8bad5d8533ead6368e9cc93e841f9a35c0520f5dbffac827fe751c3c4c3888 WatchSource:0}: Error finding container 4f8bad5d8533ead6368e9cc93e841f9a35c0520f5dbffac827fe751c3c4c3888: Status 404 returned error can't find the container with id 4f8bad5d8533ead6368e9cc93e841f9a35c0520f5dbffac827fe751c3c4c3888 Apr 16 23:29:11.671252 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:11.671210 2564 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lq92s" event={"ID":"9d0463f5-8d66-479e-a0dc-a13347a782d2","Type":"ContainerStarted","Data":"be332232a3c9652cdd6de9bf150c1e926aef189bfa3ca5faf0f39ceebc95b358"}
Apr 16 23:29:11.671747 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:11.671399 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-lq92s"
Apr 16 23:29:11.673219 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:11.673189 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbd66f789-d4wvd" event={"ID":"96951181-f52d-4ace-9e00-53ed27d2e6cc","Type":"ContainerStarted","Data":"c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af"}
Apr 16 23:29:11.673332 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:11.673226 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbd66f789-d4wvd" event={"ID":"96951181-f52d-4ace-9e00-53ed27d2e6cc","Type":"ContainerStarted","Data":"4f8bad5d8533ead6368e9cc93e841f9a35c0520f5dbffac827fe751c3c4c3888"}
Apr 16 23:29:11.682536 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:11.682512 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-lq92s"
Apr 16 23:29:11.688995 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:11.688939 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-lq92s" podStartSLOduration=1.717517688 podStartE2EDuration="17.688925589s" podCreationTimestamp="2026-04-16 23:28:54 +0000 UTC" firstStartedPulling="2026-04-16 23:28:55.262948715 +0000 UTC m=+175.878980172" lastFinishedPulling="2026-04-16 23:29:11.234356609 +0000 UTC m=+191.850388073" observedRunningTime="2026-04-16 23:29:11.687330287 +0000 UTC m=+192.303361767" watchObservedRunningTime="2026-04-16 23:29:11.688925589 +0000 UTC m=+192.304957067"
Apr 16 23:29:11.703035 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:11.702988 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dbd66f789-d4wvd" podStartSLOduration=3.702976466 podStartE2EDuration="3.702976466s" podCreationTimestamp="2026-04-16 23:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:29:11.702032234 +0000 UTC m=+192.318063737" watchObservedRunningTime="2026-04-16 23:29:11.702976466 +0000 UTC m=+192.319007945"
Apr 16 23:29:18.825433 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:18.825396 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5dbd66f789-d4wvd"
Apr 16 23:29:18.826073 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:18.825567 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dbd66f789-d4wvd"
Apr 16 23:29:18.830856 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:18.830830 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dbd66f789-d4wvd"
Apr 16 23:29:19.704272 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:19.704245 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dbd66f789-d4wvd"
Apr 16 23:29:19.759797 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:19.759761 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bf9d7b449-82fvm"]
Apr 16 23:29:22.709492 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:22.709457 2564 generic.go:358] "Generic (PLEG): container finished" podID="2665ad6e-102c-40fd-8fac-4e7fdd52738a" containerID="c712793f9b6d56940b3d528cb78b2bef7c30d47ea250ee50a3db7dd585927f94" exitCode=0
Apr 16 23:29:22.709874 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:22.709534 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr" event={"ID":"2665ad6e-102c-40fd-8fac-4e7fdd52738a","Type":"ContainerDied","Data":"c712793f9b6d56940b3d528cb78b2bef7c30d47ea250ee50a3db7dd585927f94"}
Apr 16 23:29:22.709920 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:22.709889 2564 scope.go:117] "RemoveContainer" containerID="c712793f9b6d56940b3d528cb78b2bef7c30d47ea250ee50a3db7dd585927f94"
Apr 16 23:29:23.714496 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:23.714465 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67vgr" event={"ID":"2665ad6e-102c-40fd-8fac-4e7fdd52738a","Type":"ContainerStarted","Data":"7c0432da9ab9c686adb07d8b1e596dcf9aba7ffcf2229a72d8327b3b89750407"}
Apr 16 23:29:26.293746 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:26.293714 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:29:26.297641 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:26.297618 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6db978cd86-8m9t5"
Apr 16 23:29:37.754961 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:37.754927 2564 generic.go:358] "Generic (PLEG): container finished" podID="a5316185-790a-4a40-b230-e6cc6cc0b80b" containerID="d8305379514916e55559ef5f41eac1f5c23d28616d57e3efeb30a968b2003b23" exitCode=0
Apr 16 23:29:37.755394 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:37.755001 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-69rmt" event={"ID":"a5316185-790a-4a40-b230-e6cc6cc0b80b","Type":"ContainerDied","Data":"d8305379514916e55559ef5f41eac1f5c23d28616d57e3efeb30a968b2003b23"}
Apr 16 23:29:37.755394 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:37.755370 2564 scope.go:117] "RemoveContainer" containerID="d8305379514916e55559ef5f41eac1f5c23d28616d57e3efeb30a968b2003b23"
Apr 16 23:29:38.759394 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:38.759358 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-69rmt" event={"ID":"a5316185-790a-4a40-b230-e6cc6cc0b80b","Type":"ContainerStarted","Data":"3a47005192cc9b6b1296702dbb49cd23af45d45aaa2684adfaf8027aa4a835f3"}
Apr 16 23:29:44.783309 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:44.783267 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bf9d7b449-82fvm" podUID="9d861541-9a91-4298-bac7-0ca08bf9f31b" containerName="console" containerID="cri-o://afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e" gracePeriod=15
Apr 16 23:29:45.049899 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.049876 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bf9d7b449-82fvm_9d861541-9a91-4298-bac7-0ca08bf9f31b/console/0.log"
Apr 16 23:29:45.050015 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.049937 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bf9d7b449-82fvm"
Apr 16 23:29:45.160014 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.159975 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-serving-cert\") pod \"9d861541-9a91-4298-bac7-0ca08bf9f31b\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") "
Apr 16 23:29:45.160218 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.160054 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-477hd\" (UniqueName: \"kubernetes.io/projected/9d861541-9a91-4298-bac7-0ca08bf9f31b-kube-api-access-477hd\") pod \"9d861541-9a91-4298-bac7-0ca08bf9f31b\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") "
Apr 16 23:29:45.160218 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.160089 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-service-ca\") pod \"9d861541-9a91-4298-bac7-0ca08bf9f31b\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") "
Apr 16 23:29:45.160218 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.160124 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-oauth-config\") pod \"9d861541-9a91-4298-bac7-0ca08bf9f31b\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") "
Apr 16 23:29:45.160218 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.160156 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-config\") pod \"9d861541-9a91-4298-bac7-0ca08bf9f31b\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") "
Apr 16 23:29:45.160218 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.160191 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-oauth-serving-cert\") pod \"9d861541-9a91-4298-bac7-0ca08bf9f31b\" (UID: \"9d861541-9a91-4298-bac7-0ca08bf9f31b\") "
Apr 16 23:29:45.160625 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.160592 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-config" (OuterVolumeSpecName: "console-config") pod "9d861541-9a91-4298-bac7-0ca08bf9f31b" (UID: "9d861541-9a91-4298-bac7-0ca08bf9f31b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:29:45.160764 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.160637 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9d861541-9a91-4298-bac7-0ca08bf9f31b" (UID: "9d861541-9a91-4298-bac7-0ca08bf9f31b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:29:45.160821 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.160772 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-service-ca" (OuterVolumeSpecName: "service-ca") pod "9d861541-9a91-4298-bac7-0ca08bf9f31b" (UID: "9d861541-9a91-4298-bac7-0ca08bf9f31b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:29:45.162346 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.162317 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9d861541-9a91-4298-bac7-0ca08bf9f31b" (UID: "9d861541-9a91-4298-bac7-0ca08bf9f31b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:29:45.162637 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.162612 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9d861541-9a91-4298-bac7-0ca08bf9f31b" (UID: "9d861541-9a91-4298-bac7-0ca08bf9f31b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:29:45.162637 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.162624 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d861541-9a91-4298-bac7-0ca08bf9f31b-kube-api-access-477hd" (OuterVolumeSpecName: "kube-api-access-477hd") pod "9d861541-9a91-4298-bac7-0ca08bf9f31b" (UID: "9d861541-9a91-4298-bac7-0ca08bf9f31b"). InnerVolumeSpecName "kube-api-access-477hd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:29:45.261899 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.261865 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-oauth-serving-cert\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:29:45.261899 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.261893 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-serving-cert\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:29:45.261899 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.261903 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-477hd\" (UniqueName: \"kubernetes.io/projected/9d861541-9a91-4298-bac7-0ca08bf9f31b-kube-api-access-477hd\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:29:45.262117 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.261912 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-service-ca\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:29:45.262117 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.261922 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-oauth-config\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:29:45.262117 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.261931 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d861541-9a91-4298-bac7-0ca08bf9f31b-console-config\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:29:45.786287 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.786260 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bf9d7b449-82fvm_9d861541-9a91-4298-bac7-0ca08bf9f31b/console/0.log"
Apr 16 23:29:45.786737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.786303 2564 generic.go:358] "Generic (PLEG): container finished" podID="9d861541-9a91-4298-bac7-0ca08bf9f31b" containerID="afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e" exitCode=2
Apr 16 23:29:45.786737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.786354 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bf9d7b449-82fvm" event={"ID":"9d861541-9a91-4298-bac7-0ca08bf9f31b","Type":"ContainerDied","Data":"afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e"}
Apr 16 23:29:45.786737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.786383 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bf9d7b449-82fvm"
Apr 16 23:29:45.786737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.786390 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bf9d7b449-82fvm" event={"ID":"9d861541-9a91-4298-bac7-0ca08bf9f31b","Type":"ContainerDied","Data":"4e55024354ac3fd7a3470cda673f7477484870ee511ebd8b2d0f5b564aa0a760"}
Apr 16 23:29:45.786737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.786413 2564 scope.go:117] "RemoveContainer" containerID="afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e"
Apr 16 23:29:45.794390 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.794373 2564 scope.go:117] "RemoveContainer" containerID="afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e"
Apr 16 23:29:45.794651 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:29:45.794634 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e\": container with ID starting with afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e not found: ID does not exist" containerID="afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e"
Apr 16 23:29:45.794692 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.794659 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e"} err="failed to get container status \"afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e\": rpc error: code = NotFound desc = could not find container \"afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e\": container with ID starting with afe6521c3917ac09e03f6b8820b43c230fed91b79dcbb1884d0a32f3f6d51a3e not found: ID does not exist"
Apr 16 23:29:45.805180 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.805155 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bf9d7b449-82fvm"]
Apr 16 23:29:45.808482 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.808460 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bf9d7b449-82fvm"]
Apr 16 23:29:45.982123 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:29:45.982085 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d861541-9a91-4298-bac7-0ca08bf9f31b" path="/var/lib/kubelet/pods/9d861541-9a91-4298-bac7-0ca08bf9f31b/volumes"
Apr 16 23:30:10.779478 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:10.779386 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:30:10.781790 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:10.781766 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2-metrics-certs\") pod \"network-metrics-daemon-zp26z\" (UID: \"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2\") " pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:30:10.881301 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:10.881273 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wl6xg\""
Apr 16 23:30:10.889903 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:10.889882 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zp26z"
Apr 16 23:30:11.008560 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:11.008527 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zp26z"]
Apr 16 23:30:11.011502 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:30:11.011473 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ac28c9_9db2_43a5_bb52_bc7930fe2ab2.slice/crio-c073bd6d31cb62bd78055f961c5b7900623f69f4ce35c6546b8f4298c7eaec4d WatchSource:0}: Error finding container c073bd6d31cb62bd78055f961c5b7900623f69f4ce35c6546b8f4298c7eaec4d: Status 404 returned error can't find the container with id c073bd6d31cb62bd78055f961c5b7900623f69f4ce35c6546b8f4298c7eaec4d
Apr 16 23:30:11.861554 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:11.861518 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zp26z" event={"ID":"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2","Type":"ContainerStarted","Data":"c073bd6d31cb62bd78055f961c5b7900623f69f4ce35c6546b8f4298c7eaec4d"}
Apr 16 23:30:12.866772 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:12.866732 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zp26z" event={"ID":"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2","Type":"ContainerStarted","Data":"5347f19a997135ec4719b32ae5b546e7bffe847f970601f7348fbdc212bdabd3"}
Apr 16 23:30:12.866772 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:12.866774 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zp26z" event={"ID":"e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2","Type":"ContainerStarted","Data":"82616e84c48157f1ceec34b61b9350524bbe2d4c8c8572d053bcf7e35f4ded2d"}
Apr 16 23:30:12.880968 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:12.880907 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zp26z" podStartSLOduration=251.841642581 podStartE2EDuration="4m12.880890088s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:30:11.013652693 +0000 UTC m=+251.629684150" lastFinishedPulling="2026-04-16 23:30:12.052900194 +0000 UTC m=+252.668931657" observedRunningTime="2026-04-16 23:30:12.879846535 +0000 UTC m=+253.495878006" watchObservedRunningTime="2026-04-16 23:30:12.880890088 +0000 UTC m=+253.496921578"
Apr 16 23:30:15.637216 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.637180 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9c56f4866-cjq25"]
Apr 16 23:30:15.637638 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.637615 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d861541-9a91-4298-bac7-0ca08bf9f31b" containerName="console"
Apr 16 23:30:15.637690 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.637645 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d861541-9a91-4298-bac7-0ca08bf9f31b" containerName="console"
Apr 16 23:30:15.637804 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.637783 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d861541-9a91-4298-bac7-0ca08bf9f31b" containerName="console"
Apr 16 23:30:15.641064 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.641047 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.651064 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.651038 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c56f4866-cjq25"]
Apr 16 23:30:15.720458 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.720424 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-oauth-config\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.720645 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.720467 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-serving-cert\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.720645 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.720524 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjxnh\" (UniqueName: \"kubernetes.io/projected/a868195b-60e2-4cfe-9cbf-ff92b5125a73-kube-api-access-bjxnh\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.720645 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.720614 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-oauth-serving-cert\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.720821 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.720659 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-service-ca\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.720821 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.720675 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-trusted-ca-bundle\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.720821 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.720773 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-config\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.821525 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.821488 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-service-ca\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.821749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.821538 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-trusted-ca-bundle\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.821749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.821656 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-config\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.821749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.821730 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-oauth-config\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.821749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.821752 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-serving-cert\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.821981 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.821779 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjxnh\" (UniqueName: \"kubernetes.io/projected/a868195b-60e2-4cfe-9cbf-ff92b5125a73-kube-api-access-bjxnh\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.821981 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.821819 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-oauth-serving-cert\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.822388 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.822357 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-service-ca\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.822509 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.822413 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-config\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.822509 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.822447 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-trusted-ca-bundle\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.822509 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.822472 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-oauth-serving-cert\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.824264 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.824244 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-serving-cert\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.824323 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.824259 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-oauth-config\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.828792 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.828771 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjxnh\" (UniqueName: \"kubernetes.io/projected/a868195b-60e2-4cfe-9cbf-ff92b5125a73-kube-api-access-bjxnh\") pod \"console-9c56f4866-cjq25\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:15.952761 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:15.952650 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:16.077411 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:16.077381 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c56f4866-cjq25"]
Apr 16 23:30:16.080711 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:30:16.080666 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda868195b_60e2_4cfe_9cbf_ff92b5125a73.slice/crio-0f55a17083b258d0999901a5cd119384ef50823935f7c830f2a6cc72f9be402a WatchSource:0}: Error finding container 0f55a17083b258d0999901a5cd119384ef50823935f7c830f2a6cc72f9be402a: Status 404 returned error can't find the container with id 0f55a17083b258d0999901a5cd119384ef50823935f7c830f2a6cc72f9be402a
Apr 16 23:30:16.880737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:16.880685 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c56f4866-cjq25" event={"ID":"a868195b-60e2-4cfe-9cbf-ff92b5125a73","Type":"ContainerStarted","Data":"b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c"}
Apr 16 23:30:16.880737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:16.880743 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c56f4866-cjq25" event={"ID":"a868195b-60e2-4cfe-9cbf-ff92b5125a73","Type":"ContainerStarted","Data":"0f55a17083b258d0999901a5cd119384ef50823935f7c830f2a6cc72f9be402a"}
Apr 16 23:30:16.896963 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:16.896910 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9c56f4866-cjq25" podStartSLOduration=1.8968943029999998 podStartE2EDuration="1.896894303s" podCreationTimestamp="2026-04-16 23:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:30:16.895340346 +0000 UTC m=+257.511371836" watchObservedRunningTime="2026-04-16 23:30:16.896894303 +0000 UTC m=+257.512925783"
Apr 16 23:30:25.953461 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:25.953415 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:25.953461 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:25.953469 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:25.958335 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:25.958310 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:26.913838 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:26.913808 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9c56f4866-cjq25"
Apr 16 23:30:26.955763 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:26.955729 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dbd66f789-d4wvd"]
Apr 16 23:30:51.975486 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:51.975424 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5dbd66f789-d4wvd" podUID="96951181-f52d-4ace-9e00-53ed27d2e6cc" containerName="console" containerID="cri-o://c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af" gracePeriod=15
Apr 16 23:30:52.224022 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.223986 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dbd66f789-d4wvd_96951181-f52d-4ace-9e00-53ed27d2e6cc/console/0.log"
Apr 16 23:30:52.224136 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.224055 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbd66f789-d4wvd"
Apr 16 23:30:52.234187 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234136 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-serving-cert\") pod \"96951181-f52d-4ace-9e00-53ed27d2e6cc\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") "
Apr 16 23:30:52.234187 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234168 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-service-ca\") pod \"96951181-f52d-4ace-9e00-53ed27d2e6cc\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") "
Apr 16 23:30:52.234281 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234192 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-trusted-ca-bundle\") pod \"96951181-f52d-4ace-9e00-53ed27d2e6cc\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") "
Apr 16 23:30:52.234281 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234223 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-config\") pod \"96951181-f52d-4ace-9e00-53ed27d2e6cc\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") "
Apr 16 23:30:52.234358 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234292 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-oauth-serving-cert\") pod \"96951181-f52d-4ace-9e00-53ed27d2e6cc\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") "
Apr 16 23:30:52.234358 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234325 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-oauth-config\") pod \"96951181-f52d-4ace-9e00-53ed27d2e6cc\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") "
Apr 16 23:30:52.234454 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234367 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmbrw\" (UniqueName: \"kubernetes.io/projected/96951181-f52d-4ace-9e00-53ed27d2e6cc-kube-api-access-rmbrw\") pod \"96951181-f52d-4ace-9e00-53ed27d2e6cc\" (UID: \"96951181-f52d-4ace-9e00-53ed27d2e6cc\") "
Apr 16 23:30:52.234609 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234575 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-service-ca" (OuterVolumeSpecName: "service-ca") pod "96951181-f52d-4ace-9e00-53ed27d2e6cc" (UID: "96951181-f52d-4ace-9e00-53ed27d2e6cc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:30:52.234609 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234587 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "96951181-f52d-4ace-9e00-53ed27d2e6cc" (UID: "96951181-f52d-4ace-9e00-53ed27d2e6cc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:30:52.234609 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234597 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-config" (OuterVolumeSpecName: "console-config") pod "96951181-f52d-4ace-9e00-53ed27d2e6cc" (UID: "96951181-f52d-4ace-9e00-53ed27d2e6cc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:30:52.234781 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.234674 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96951181-f52d-4ace-9e00-53ed27d2e6cc" (UID: "96951181-f52d-4ace-9e00-53ed27d2e6cc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:30:52.236302 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.236274 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96951181-f52d-4ace-9e00-53ed27d2e6cc" (UID: "96951181-f52d-4ace-9e00-53ed27d2e6cc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:30:52.236412 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.236357 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96951181-f52d-4ace-9e00-53ed27d2e6cc" (UID: "96951181-f52d-4ace-9e00-53ed27d2e6cc"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:52.236455 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.236426 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96951181-f52d-4ace-9e00-53ed27d2e6cc-kube-api-access-rmbrw" (OuterVolumeSpecName: "kube-api-access-rmbrw") pod "96951181-f52d-4ace-9e00-53ed27d2e6cc" (UID: "96951181-f52d-4ace-9e00-53ed27d2e6cc"). InnerVolumeSpecName "kube-api-access-rmbrw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:30:52.335651 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.335613 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rmbrw\" (UniqueName: \"kubernetes.io/projected/96951181-f52d-4ace-9e00-53ed27d2e6cc-kube-api-access-rmbrw\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:30:52.335651 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.335645 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-serving-cert\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:30:52.335651 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.335655 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-service-ca\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:30:52.335901 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.335665 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-trusted-ca-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:30:52.335901 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.335675 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-config\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:30:52.335901 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.335685 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96951181-f52d-4ace-9e00-53ed27d2e6cc-oauth-serving-cert\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:30:52.335901 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.335693 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96951181-f52d-4ace-9e00-53ed27d2e6cc-console-oauth-config\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:30:52.987784 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.987756 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dbd66f789-d4wvd_96951181-f52d-4ace-9e00-53ed27d2e6cc/console/0.log" Apr 16 23:30:52.988181 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.987798 2564 generic.go:358] "Generic (PLEG): container finished" podID="96951181-f52d-4ace-9e00-53ed27d2e6cc" containerID="c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af" exitCode=2 Apr 16 23:30:52.988181 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.987834 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbd66f789-d4wvd" event={"ID":"96951181-f52d-4ace-9e00-53ed27d2e6cc","Type":"ContainerDied","Data":"c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af"} Apr 16 23:30:52.988181 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.987855 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbd66f789-d4wvd" event={"ID":"96951181-f52d-4ace-9e00-53ed27d2e6cc","Type":"ContainerDied","Data":"4f8bad5d8533ead6368e9cc93e841f9a35c0520f5dbffac827fe751c3c4c3888"} Apr 16 23:30:52.988181 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:30:52.987860 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbd66f789-d4wvd" Apr 16 23:30:52.988181 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.987870 2564 scope.go:117] "RemoveContainer" containerID="c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af" Apr 16 23:30:52.997112 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.997094 2564 scope.go:117] "RemoveContainer" containerID="c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af" Apr 16 23:30:52.997366 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:30:52.997345 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af\": container with ID starting with c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af not found: ID does not exist" containerID="c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af" Apr 16 23:30:52.997429 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:52.997374 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af"} err="failed to get container status \"c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af\": rpc error: code = NotFound desc = could not find container \"c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af\": container with ID starting with c08f62b3f5543d37c082d02377eb4e1354206679a189fe0f2e725b89bf8c45af not found: ID does not exist" Apr 16 23:30:53.007164 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:53.007137 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dbd66f789-d4wvd"] Apr 16 23:30:53.011727 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:53.011686 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-5dbd66f789-d4wvd"] Apr 16 23:30:53.981640 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:53.981602 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96951181-f52d-4ace-9e00-53ed27d2e6cc" path="/var/lib/kubelet/pods/96951181-f52d-4ace-9e00-53ed27d2e6cc/volumes" Apr 16 23:30:59.033991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.033951 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr"] Apr 16 23:30:59.034406 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.034246 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96951181-f52d-4ace-9e00-53ed27d2e6cc" containerName="console" Apr 16 23:30:59.034406 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.034256 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="96951181-f52d-4ace-9e00-53ed27d2e6cc" containerName="console" Apr 16 23:30:59.034406 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.034308 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="96951181-f52d-4ace-9e00-53ed27d2e6cc" containerName="console" Apr 16 23:30:59.038991 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.038972 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.041120 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.041096 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6fmt\"" Apr 16 23:30:59.041270 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.041100 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 23:30:59.042191 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.042169 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 23:30:59.044012 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.043981 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr"] Apr 16 23:30:59.086045 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.086006 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9d2\" (UniqueName: \"kubernetes.io/projected/e270ae9c-2750-4ffb-902d-c7607b0b1899-kube-api-access-7b9d2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.086178 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.086060 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.086178 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.086089 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.187480 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.187443 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.187661 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.187493 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.187661 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.187551 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9d2\" (UniqueName: \"kubernetes.io/projected/e270ae9c-2750-4ffb-902d-c7607b0b1899-kube-api-access-7b9d2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.187824 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.187805 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.187882 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.187843 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.194751 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.194734 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9d2\" (UniqueName: \"kubernetes.io/projected/e270ae9c-2750-4ffb-902d-c7607b0b1899-kube-api-access-7b9d2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.349534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.349442 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:30:59.468735 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.468681 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr"] Apr 16 23:30:59.471414 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:30:59.471381 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode270ae9c_2750_4ffb_902d_c7607b0b1899.slice/crio-8d922b833b4d2f0e7f94a560a57cb35320e899cee1c48bd3b5c58d1f4cf31a84 WatchSource:0}: Error finding container 8d922b833b4d2f0e7f94a560a57cb35320e899cee1c48bd3b5c58d1f4cf31a84: Status 404 returned error can't find the container with id 8d922b833b4d2f0e7f94a560a57cb35320e899cee1c48bd3b5c58d1f4cf31a84 Apr 16 23:30:59.883109 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.883084 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:30:59.883287 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.883180 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:30:59.886280 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.886261 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log" Apr 16 23:30:59.886280 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:30:59.886267 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log" Apr 16 23:30:59.898411 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:30:59.898384 2564 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 23:31:00.009384 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:00.009345 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" event={"ID":"e270ae9c-2750-4ffb-902d-c7607b0b1899","Type":"ContainerStarted","Data":"8d922b833b4d2f0e7f94a560a57cb35320e899cee1c48bd3b5c58d1f4cf31a84"} Apr 16 23:31:07.031744 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:07.031691 2564 generic.go:358] "Generic (PLEG): container finished" podID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerID="d14cc96743cd25fc50fadb48d98fd00bcf271767a9cfeaa4d638a7485dc6a30e" exitCode=0 Apr 16 23:31:07.032127 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:07.031756 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" event={"ID":"e270ae9c-2750-4ffb-902d-c7607b0b1899","Type":"ContainerDied","Data":"d14cc96743cd25fc50fadb48d98fd00bcf271767a9cfeaa4d638a7485dc6a30e"} Apr 16 23:31:07.032734 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:07.032719 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:31:10.042023 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:10.041978 2564 generic.go:358] "Generic (PLEG): container finished" podID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerID="e5acd3a30afaeb5e036707797c7aafd7a2a23c863b43001acb3102ed219a9b5e" exitCode=0 Apr 16 23:31:10.042474 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:10.042033 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" event={"ID":"e270ae9c-2750-4ffb-902d-c7607b0b1899","Type":"ContainerDied","Data":"e5acd3a30afaeb5e036707797c7aafd7a2a23c863b43001acb3102ed219a9b5e"} Apr 16 23:31:19.078517 ip-10-0-131-43 kubenswrapper[2564]: 
I0416 23:31:19.078482 2564 generic.go:358] "Generic (PLEG): container finished" podID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerID="4cac6fd32d427b96f2a3bab31392b930b5b1a21b4eaf7d259941e8cbdb7e1d39" exitCode=0 Apr 16 23:31:19.078950 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:19.078568 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" event={"ID":"e270ae9c-2750-4ffb-902d-c7607b0b1899","Type":"ContainerDied","Data":"4cac6fd32d427b96f2a3bab31392b930b5b1a21b4eaf7d259941e8cbdb7e1d39"} Apr 16 23:31:20.205431 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.205408 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:31:20.275644 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.275606 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-bundle\") pod \"e270ae9c-2750-4ffb-902d-c7607b0b1899\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " Apr 16 23:31:20.275849 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.275776 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-util\") pod \"e270ae9c-2750-4ffb-902d-c7607b0b1899\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " Apr 16 23:31:20.275849 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.275808 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b9d2\" (UniqueName: \"kubernetes.io/projected/e270ae9c-2750-4ffb-902d-c7607b0b1899-kube-api-access-7b9d2\") pod \"e270ae9c-2750-4ffb-902d-c7607b0b1899\" (UID: \"e270ae9c-2750-4ffb-902d-c7607b0b1899\") " Apr 16 23:31:20.276237 ip-10-0-131-43 kubenswrapper[2564]: 
I0416 23:31:20.276209 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-bundle" (OuterVolumeSpecName: "bundle") pod "e270ae9c-2750-4ffb-902d-c7607b0b1899" (UID: "e270ae9c-2750-4ffb-902d-c7607b0b1899"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:31:20.278002 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.277976 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e270ae9c-2750-4ffb-902d-c7607b0b1899-kube-api-access-7b9d2" (OuterVolumeSpecName: "kube-api-access-7b9d2") pod "e270ae9c-2750-4ffb-902d-c7607b0b1899" (UID: "e270ae9c-2750-4ffb-902d-c7607b0b1899"). InnerVolumeSpecName "kube-api-access-7b9d2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:31:20.280378 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.280356 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-util" (OuterVolumeSpecName: "util") pod "e270ae9c-2750-4ffb-902d-c7607b0b1899" (UID: "e270ae9c-2750-4ffb-902d-c7607b0b1899"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:31:20.376605 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.376525 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:31:20.376605 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.376555 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e270ae9c-2750-4ffb-902d-c7607b0b1899-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:31:20.376605 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:20.376565 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7b9d2\" (UniqueName: \"kubernetes.io/projected/e270ae9c-2750-4ffb-902d-c7607b0b1899-kube-api-access-7b9d2\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:31:21.086040 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:21.086004 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" event={"ID":"e270ae9c-2750-4ffb-902d-c7607b0b1899","Type":"ContainerDied","Data":"8d922b833b4d2f0e7f94a560a57cb35320e899cee1c48bd3b5c58d1f4cf31a84"} Apr 16 23:31:21.086040 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:21.086039 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nxtrr" Apr 16 23:31:21.086040 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:21.086044 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d922b833b4d2f0e7f94a560a57cb35320e899cee1c48bd3b5c58d1f4cf31a84" Apr 16 23:31:26.093857 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.093824 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5"] Apr 16 23:31:26.094527 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.094183 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerName="util" Apr 16 23:31:26.094527 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.094198 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerName="util" Apr 16 23:31:26.094527 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.094227 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerName="pull" Apr 16 23:31:26.094527 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.094234 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerName="pull" Apr 16 23:31:26.094527 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.094243 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerName="extract" Apr 16 23:31:26.094527 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.094252 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerName="extract" Apr 16 23:31:26.094527 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.094334 2564 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="e270ae9c-2750-4ffb-902d-c7607b0b1899" containerName="extract" Apr 16 23:31:26.098063 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.098044 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" Apr 16 23:31:26.100201 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.100183 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:31:26.100302 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.100204 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-ksxp6\"" Apr 16 23:31:26.100302 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.100186 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 23:31:26.107634 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.107605 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5"] Apr 16 23:31:26.229242 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.229216 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/668c1c1b-45ad-4139-8530-499603720b9c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n6nf5\" (UID: \"668c1c1b-45ad-4139-8530-499603720b9c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" Apr 16 23:31:26.229403 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.229273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjjj\" (UniqueName: \"kubernetes.io/projected/668c1c1b-45ad-4139-8530-499603720b9c-kube-api-access-psjjj\") pod 
\"cert-manager-operator-controller-manager-7ccfb878b5-n6nf5\" (UID: \"668c1c1b-45ad-4139-8530-499603720b9c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" Apr 16 23:31:26.329894 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.329860 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/668c1c1b-45ad-4139-8530-499603720b9c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n6nf5\" (UID: \"668c1c1b-45ad-4139-8530-499603720b9c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" Apr 16 23:31:26.330079 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.329935 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psjjj\" (UniqueName: \"kubernetes.io/projected/668c1c1b-45ad-4139-8530-499603720b9c-kube-api-access-psjjj\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n6nf5\" (UID: \"668c1c1b-45ad-4139-8530-499603720b9c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" Apr 16 23:31:26.330254 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.330234 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/668c1c1b-45ad-4139-8530-499603720b9c-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n6nf5\" (UID: \"668c1c1b-45ad-4139-8530-499603720b9c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" Apr 16 23:31:26.358425 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.358339 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjjj\" (UniqueName: \"kubernetes.io/projected/668c1c1b-45ad-4139-8530-499603720b9c-kube-api-access-psjjj\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-n6nf5\" (UID: \"668c1c1b-45ad-4139-8530-499603720b9c\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" Apr 16 23:31:26.408102 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.408060 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" Apr 16 23:31:26.556109 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:26.556076 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5"] Apr 16 23:31:26.561301 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:31:26.561258 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod668c1c1b_45ad_4139_8530_499603720b9c.slice/crio-f23a2c4a189216ec96176735b6f7cbe81ba8c2c986fa0f06e69e7b2d658f91e4 WatchSource:0}: Error finding container f23a2c4a189216ec96176735b6f7cbe81ba8c2c986fa0f06e69e7b2d658f91e4: Status 404 returned error can't find the container with id f23a2c4a189216ec96176735b6f7cbe81ba8c2c986fa0f06e69e7b2d658f91e4 Apr 16 23:31:27.104231 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:27.104190 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" event={"ID":"668c1c1b-45ad-4139-8530-499603720b9c","Type":"ContainerStarted","Data":"f23a2c4a189216ec96176735b6f7cbe81ba8c2c986fa0f06e69e7b2d658f91e4"} Apr 16 23:31:30.115609 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:30.115570 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" event={"ID":"668c1c1b-45ad-4139-8530-499603720b9c","Type":"ContainerStarted","Data":"95ff01df4cde6d5687285111dfff8eafc50f17f63dd8a7ca21cbf70409174257"} Apr 16 23:31:30.133101 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:30.133051 2564 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-n6nf5" podStartSLOduration=1.363043837 podStartE2EDuration="4.133034479s" podCreationTimestamp="2026-04-16 23:31:26 +0000 UTC" firstStartedPulling="2026-04-16 23:31:26.563832385 +0000 UTC m=+327.179863849" lastFinishedPulling="2026-04-16 23:31:29.33382303 +0000 UTC m=+329.949854491" observedRunningTime="2026-04-16 23:31:30.131802974 +0000 UTC m=+330.747834451" watchObservedRunningTime="2026-04-16 23:31:30.133034479 +0000 UTC m=+330.749065957" Apr 16 23:31:31.106845 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.106812 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz"] Apr 16 23:31:31.110392 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.110374 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.112489 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.112467 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6fmt\"" Apr 16 23:31:31.112588 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.112467 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 23:31:31.113463 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.113448 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 23:31:31.116749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.116730 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz"] Apr 16 23:31:31.271072 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.271030 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq4sr\" (UniqueName: \"kubernetes.io/projected/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-kube-api-access-lq4sr\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.271072 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.271082 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.271290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.271218 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.371925 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.371837 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.371925 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.371896 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq4sr\" (UniqueName: \"kubernetes.io/projected/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-kube-api-access-lq4sr\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.372123 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.371936 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.372318 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.372295 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.372354 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.372322 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.379244 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.379220 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq4sr\" 
(UniqueName: \"kubernetes.io/projected/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-kube-api-access-lq4sr\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.421248 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.421204 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:31.542675 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:31.542644 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz"] Apr 16 23:31:31.545768 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:31:31.545736 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod773c40b8_ed4f_4b28_84f9_8ccf9bda3fb6.slice/crio-152df0db073e8c081e1f0d1dcd0490dd739e87421c4cf65f649f399a475b118a WatchSource:0}: Error finding container 152df0db073e8c081e1f0d1dcd0490dd739e87421c4cf65f649f399a475b118a: Status 404 returned error can't find the container with id 152df0db073e8c081e1f0d1dcd0490dd739e87421c4cf65f649f399a475b118a Apr 16 23:31:32.128236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:32.128194 2564 generic.go:358] "Generic (PLEG): container finished" podID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerID="be7403f83794c02778e391f5cdd0d32f01cba908d3ce9ae00757fc53327ad50f" exitCode=0 Apr 16 23:31:32.128236 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:32.128241 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" 
event={"ID":"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6","Type":"ContainerDied","Data":"be7403f83794c02778e391f5cdd0d32f01cba908d3ce9ae00757fc53327ad50f"} Apr 16 23:31:32.128808 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:32.128264 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" event={"ID":"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6","Type":"ContainerStarted","Data":"152df0db073e8c081e1f0d1dcd0490dd739e87421c4cf65f649f399a475b118a"} Apr 16 23:31:33.491771 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.491736 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-j4fqx"] Apr 16 23:31:33.495086 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.495060 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:33.497258 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.497232 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 23:31:33.497388 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.497292 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 23:31:33.498735 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.498693 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-lbvzz\"" Apr 16 23:31:33.501279 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.501242 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-j4fqx"] Apr 16 23:31:33.592378 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.592349 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/385da084-dde7-4795-9894-31e8270cda01-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-j4fqx\" (UID: \"385da084-dde7-4795-9894-31e8270cda01\") " pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:33.592526 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.592420 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84rrr\" (UniqueName: \"kubernetes.io/projected/385da084-dde7-4795-9894-31e8270cda01-kube-api-access-84rrr\") pod \"cert-manager-webhook-597b96b99b-j4fqx\" (UID: \"385da084-dde7-4795-9894-31e8270cda01\") " pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:33.693533 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.693497 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84rrr\" (UniqueName: \"kubernetes.io/projected/385da084-dde7-4795-9894-31e8270cda01-kube-api-access-84rrr\") pod \"cert-manager-webhook-597b96b99b-j4fqx\" (UID: \"385da084-dde7-4795-9894-31e8270cda01\") " pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:33.693693 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.693565 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/385da084-dde7-4795-9894-31e8270cda01-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-j4fqx\" (UID: \"385da084-dde7-4795-9894-31e8270cda01\") " pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:33.701345 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.701308 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/385da084-dde7-4795-9894-31e8270cda01-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-j4fqx\" (UID: \"385da084-dde7-4795-9894-31e8270cda01\") " pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 
23:31:33.701908 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.701886 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84rrr\" (UniqueName: \"kubernetes.io/projected/385da084-dde7-4795-9894-31e8270cda01-kube-api-access-84rrr\") pod \"cert-manager-webhook-597b96b99b-j4fqx\" (UID: \"385da084-dde7-4795-9894-31e8270cda01\") " pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:33.815293 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.815211 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:33.937293 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:33.937270 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-j4fqx"] Apr 16 23:31:33.939028 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:31:33.938999 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod385da084_dde7_4795_9894_31e8270cda01.slice/crio-0d2bd6344fc615617d94236708abe99befe6d2dae4fd53649f8942884979cfc4 WatchSource:0}: Error finding container 0d2bd6344fc615617d94236708abe99befe6d2dae4fd53649f8942884979cfc4: Status 404 returned error can't find the container with id 0d2bd6344fc615617d94236708abe99befe6d2dae4fd53649f8942884979cfc4 Apr 16 23:31:34.136554 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:34.136469 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" event={"ID":"385da084-dde7-4795-9894-31e8270cda01","Type":"ContainerStarted","Data":"0d2bd6344fc615617d94236708abe99befe6d2dae4fd53649f8942884979cfc4"} Apr 16 23:31:36.116480 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.116441 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xjjd8"] Apr 16 23:31:36.120322 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:31:36.120299 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" Apr 16 23:31:36.122606 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.122578 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-wq84v\"" Apr 16 23:31:36.125811 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.125772 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xjjd8"] Apr 16 23:31:36.147646 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.147599 2564 generic.go:358] "Generic (PLEG): container finished" podID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerID="2bb721aef302319bbdaa9730f302051657365dfc5d31639067c238f7e39080b6" exitCode=0 Apr 16 23:31:36.147840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.147671 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" event={"ID":"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6","Type":"ContainerDied","Data":"2bb721aef302319bbdaa9730f302051657365dfc5d31639067c238f7e39080b6"} Apr 16 23:31:36.215121 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.215087 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2m7m\" (UniqueName: \"kubernetes.io/projected/22c0fa0d-195a-4fb3-883f-544ee7765725-kube-api-access-k2m7m\") pod \"cert-manager-cainjector-8966b78d4-xjjd8\" (UID: \"22c0fa0d-195a-4fb3-883f-544ee7765725\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" Apr 16 23:31:36.215290 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.215152 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22c0fa0d-195a-4fb3-883f-544ee7765725-bound-sa-token\") pod 
\"cert-manager-cainjector-8966b78d4-xjjd8\" (UID: \"22c0fa0d-195a-4fb3-883f-544ee7765725\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" Apr 16 23:31:36.316105 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.316066 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2m7m\" (UniqueName: \"kubernetes.io/projected/22c0fa0d-195a-4fb3-883f-544ee7765725-kube-api-access-k2m7m\") pod \"cert-manager-cainjector-8966b78d4-xjjd8\" (UID: \"22c0fa0d-195a-4fb3-883f-544ee7765725\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" Apr 16 23:31:36.316304 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.316158 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22c0fa0d-195a-4fb3-883f-544ee7765725-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xjjd8\" (UID: \"22c0fa0d-195a-4fb3-883f-544ee7765725\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" Apr 16 23:31:36.324447 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.324413 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2m7m\" (UniqueName: \"kubernetes.io/projected/22c0fa0d-195a-4fb3-883f-544ee7765725-kube-api-access-k2m7m\") pod \"cert-manager-cainjector-8966b78d4-xjjd8\" (UID: \"22c0fa0d-195a-4fb3-883f-544ee7765725\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" Apr 16 23:31:36.324613 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.324447 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22c0fa0d-195a-4fb3-883f-544ee7765725-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xjjd8\" (UID: \"22c0fa0d-195a-4fb3-883f-544ee7765725\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" Apr 16 23:31:36.432599 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.432505 2564 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" Apr 16 23:31:36.930977 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:36.930925 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xjjd8"] Apr 16 23:31:36.933622 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:31:36.933592 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22c0fa0d_195a_4fb3_883f_544ee7765725.slice/crio-f30c364e5d4a38b9375a58aa3afe68a20fc292c433147f4a37e7b99cb31202ab WatchSource:0}: Error finding container f30c364e5d4a38b9375a58aa3afe68a20fc292c433147f4a37e7b99cb31202ab: Status 404 returned error can't find the container with id f30c364e5d4a38b9375a58aa3afe68a20fc292c433147f4a37e7b99cb31202ab Apr 16 23:31:37.153299 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:37.153213 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" event={"ID":"385da084-dde7-4795-9894-31e8270cda01","Type":"ContainerStarted","Data":"ca669d88a3593a7260d22149debda6019b82159972c2d2c7e3f80247dbb3d65f"} Apr 16 23:31:37.153767 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:37.153317 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:37.154732 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:37.154693 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" event={"ID":"22c0fa0d-195a-4fb3-883f-544ee7765725","Type":"ContainerStarted","Data":"e604ffdd4a31719834ca8d134e45de88646263daceffd22eced09419fb86c06a"} Apr 16 23:31:37.154839 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:37.154736 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" 
event={"ID":"22c0fa0d-195a-4fb3-883f-544ee7765725","Type":"ContainerStarted","Data":"f30c364e5d4a38b9375a58aa3afe68a20fc292c433147f4a37e7b99cb31202ab"} Apr 16 23:31:37.157031 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:37.156977 2564 generic.go:358] "Generic (PLEG): container finished" podID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerID="1a64f95e0b8908ab8b58d696209208b8b03eaa0315d9c2ffb60cc7dbbf6a47ea" exitCode=0 Apr 16 23:31:37.157031 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:37.157014 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" event={"ID":"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6","Type":"ContainerDied","Data":"1a64f95e0b8908ab8b58d696209208b8b03eaa0315d9c2ffb60cc7dbbf6a47ea"} Apr 16 23:31:37.170082 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:37.170037 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" podStartSLOduration=1.248005387 podStartE2EDuration="4.170023626s" podCreationTimestamp="2026-04-16 23:31:33 +0000 UTC" firstStartedPulling="2026-04-16 23:31:33.940948705 +0000 UTC m=+334.556980165" lastFinishedPulling="2026-04-16 23:31:36.862966947 +0000 UTC m=+337.478998404" observedRunningTime="2026-04-16 23:31:37.168175331 +0000 UTC m=+337.784206811" watchObservedRunningTime="2026-04-16 23:31:37.170023626 +0000 UTC m=+337.786055105" Apr 16 23:31:37.182380 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:37.182318 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-xjjd8" podStartSLOduration=1.182297765 podStartE2EDuration="1.182297765s" podCreationTimestamp="2026-04-16 23:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:31:37.180936967 +0000 UTC m=+337.796968449" 
watchObservedRunningTime="2026-04-16 23:31:37.182297765 +0000 UTC m=+337.798329248" Apr 16 23:31:38.278937 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.278914 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:38.434652 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.434560 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-util\") pod \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " Apr 16 23:31:38.434652 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.434609 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq4sr\" (UniqueName: \"kubernetes.io/projected/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-kube-api-access-lq4sr\") pod \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " Apr 16 23:31:38.434652 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.434651 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-bundle\") pod \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\" (UID: \"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6\") " Apr 16 23:31:38.435151 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.435117 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-bundle" (OuterVolumeSpecName: "bundle") pod "773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" (UID: "773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:31:38.436782 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.436759 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-kube-api-access-lq4sr" (OuterVolumeSpecName: "kube-api-access-lq4sr") pod "773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" (UID: "773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6"). InnerVolumeSpecName "kube-api-access-lq4sr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:31:38.438886 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.438863 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-util" (OuterVolumeSpecName: "util") pod "773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" (UID: "773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:31:38.535774 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.535738 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:31:38.535774 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.535767 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lq4sr\" (UniqueName: \"kubernetes.io/projected/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-kube-api-access-lq4sr\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:31:38.535774 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:38.535778 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:31:39.165249 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:39.165212 2564 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" event={"ID":"773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6","Type":"ContainerDied","Data":"152df0db073e8c081e1f0d1dcd0490dd739e87421c4cf65f649f399a475b118a"} Apr 16 23:31:39.165249 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:39.165248 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152df0db073e8c081e1f0d1dcd0490dd739e87421c4cf65f649f399a475b118a" Apr 16 23:31:39.165453 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:39.165269 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxxxgz" Apr 16 23:31:43.162560 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.162531 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-j4fqx" Apr 16 23:31:43.658793 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.658758 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-rfn9t"] Apr 16 23:31:43.659115 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.659101 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerName="pull" Apr 16 23:31:43.659175 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.659119 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerName="pull" Apr 16 23:31:43.659175 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.659137 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerName="util" Apr 16 23:31:43.659175 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.659142 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerName="util" Apr 16 
23:31:43.659175 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.659163 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerName="extract" Apr 16 23:31:43.659175 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.659170 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerName="extract" Apr 16 23:31:43.659326 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.659223 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="773c40b8-ed4f-4b28-84f9-8ccf9bda3fb6" containerName="extract" Apr 16 23:31:43.664592 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.664575 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rfn9t" Apr 16 23:31:43.666842 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.666821 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-nqkrp\"" Apr 16 23:31:43.668026 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.668002 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rfn9t"] Apr 16 23:31:43.679190 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.679160 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbxh7\" (UniqueName: \"kubernetes.io/projected/1680e678-b6a0-4e59-8786-f48688ebd24b-kube-api-access-cbxh7\") pod \"cert-manager-759f64656b-rfn9t\" (UID: \"1680e678-b6a0-4e59-8786-f48688ebd24b\") " pod="cert-manager/cert-manager-759f64656b-rfn9t" Apr 16 23:31:43.679305 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.679227 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1680e678-b6a0-4e59-8786-f48688ebd24b-bound-sa-token\") pod 
\"cert-manager-759f64656b-rfn9t\" (UID: \"1680e678-b6a0-4e59-8786-f48688ebd24b\") " pod="cert-manager/cert-manager-759f64656b-rfn9t" Apr 16 23:31:43.779660 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.779623 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbxh7\" (UniqueName: \"kubernetes.io/projected/1680e678-b6a0-4e59-8786-f48688ebd24b-kube-api-access-cbxh7\") pod \"cert-manager-759f64656b-rfn9t\" (UID: \"1680e678-b6a0-4e59-8786-f48688ebd24b\") " pod="cert-manager/cert-manager-759f64656b-rfn9t" Apr 16 23:31:43.779863 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.779674 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1680e678-b6a0-4e59-8786-f48688ebd24b-bound-sa-token\") pod \"cert-manager-759f64656b-rfn9t\" (UID: \"1680e678-b6a0-4e59-8786-f48688ebd24b\") " pod="cert-manager/cert-manager-759f64656b-rfn9t" Apr 16 23:31:43.787312 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.787286 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1680e678-b6a0-4e59-8786-f48688ebd24b-bound-sa-token\") pod \"cert-manager-759f64656b-rfn9t\" (UID: \"1680e678-b6a0-4e59-8786-f48688ebd24b\") " pod="cert-manager/cert-manager-759f64656b-rfn9t" Apr 16 23:31:43.787417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.787295 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbxh7\" (UniqueName: \"kubernetes.io/projected/1680e678-b6a0-4e59-8786-f48688ebd24b-kube-api-access-cbxh7\") pod \"cert-manager-759f64656b-rfn9t\" (UID: \"1680e678-b6a0-4e59-8786-f48688ebd24b\") " pod="cert-manager/cert-manager-759f64656b-rfn9t" Apr 16 23:31:43.975018 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:43.974984 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rfn9t" Apr 16 23:31:44.097962 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:44.097923 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rfn9t"] Apr 16 23:31:44.102747 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:31:44.102673 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1680e678_b6a0_4e59_8786_f48688ebd24b.slice/crio-5a7cb83e74b0fcc18c20bded2bb9d45d35f928a1046c71dd9c53099c7bceb204 WatchSource:0}: Error finding container 5a7cb83e74b0fcc18c20bded2bb9d45d35f928a1046c71dd9c53099c7bceb204: Status 404 returned error can't find the container with id 5a7cb83e74b0fcc18c20bded2bb9d45d35f928a1046c71dd9c53099c7bceb204 Apr 16 23:31:44.183583 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:44.183551 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rfn9t" event={"ID":"1680e678-b6a0-4e59-8786-f48688ebd24b","Type":"ContainerStarted","Data":"dd5d07b0405b6736617f6daeb96582041dc48ab67b0a03db9ad6f590b98cf9d5"} Apr 16 23:31:44.183988 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:44.183590 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rfn9t" event={"ID":"1680e678-b6a0-4e59-8786-f48688ebd24b","Type":"ContainerStarted","Data":"5a7cb83e74b0fcc18c20bded2bb9d45d35f928a1046c71dd9c53099c7bceb204"} Apr 16 23:31:45.202449 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:45.202402 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-rfn9t" podStartSLOduration=2.202383073 podStartE2EDuration="2.202383073s" podCreationTimestamp="2026-04-16 23:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:31:45.201068363 +0000 UTC m=+345.817099842" 
watchObservedRunningTime="2026-04-16 23:31:45.202383073 +0000 UTC m=+345.818414545"
Apr 16 23:31:51.506292 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.506254 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"]
Apr 16 23:31:51.560522 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.560492 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"]
Apr 16 23:31:51.560678 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.560622 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.562944 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.562915 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 23:31:51.562944 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.562927 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 23:31:51.563115 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.563057 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6fmt\""
Apr 16 23:31:51.639568 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.639524 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.639747 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.639577
2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.639747 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.639613 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2g58\" (UniqueName: \"kubernetes.io/projected/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-kube-api-access-x2g58\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.740688 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.740648 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.740896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.740737 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.740896 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.740762 2564 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-x2g58\" (UniqueName: \"kubernetes.io/projected/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-kube-api-access-x2g58\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.741119 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.741095 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.741187 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.741120 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.747680 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.747649 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2g58\" (UniqueName: \"kubernetes.io/projected/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-kube-api-access-x2g58\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.870079 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.869993 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:51.996215 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:51.996192 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"]
Apr 16 23:31:51.998756 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:31:51.998732 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7272b0e_1af3_49b5_b9a5_d4585cc6be22.slice/crio-5fc0d7c5e90269557bdaef891289fbe7450ce19368e32621abd3747cbc6e8767 WatchSource:0}: Error finding container 5fc0d7c5e90269557bdaef891289fbe7450ce19368e32621abd3747cbc6e8767: Status 404 returned error can't find the container with id 5fc0d7c5e90269557bdaef891289fbe7450ce19368e32621abd3747cbc6e8767
Apr 16 23:31:52.210860 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:52.210828 2564 generic.go:358] "Generic (PLEG): container finished" podID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerID="60e23091912c19629ddaf786f4368cf6c8ff93c74616a10e48a1d66cb972b7c4" exitCode=0
Apr 16 23:31:52.210989 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:52.210885 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47" event={"ID":"f7272b0e-1af3-49b5-b9a5-d4585cc6be22","Type":"ContainerDied","Data":"60e23091912c19629ddaf786f4368cf6c8ff93c74616a10e48a1d66cb972b7c4"}
Apr 16 23:31:52.210989 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:52.210909 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47" event={"ID":"f7272b0e-1af3-49b5-b9a5-d4585cc6be22","Type":"ContainerStarted","Data":"5fc0d7c5e90269557bdaef891289fbe7450ce19368e32621abd3747cbc6e8767"}
Apr 16 23:31:53.215528 ip-10-0-131-43 kubenswrapper[2564]: I0416
23:31:53.215494 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47" event={"ID":"f7272b0e-1af3-49b5-b9a5-d4585cc6be22","Type":"ContainerStarted","Data":"9a082e6da2c5a25fb40ffac9a912fd60ed5cc9cf24d9a23b93ee665a87869806"}
Apr 16 23:31:54.220217 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:54.220182 2564 generic.go:358] "Generic (PLEG): container finished" podID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerID="9a082e6da2c5a25fb40ffac9a912fd60ed5cc9cf24d9a23b93ee665a87869806" exitCode=0
Apr 16 23:31:54.220586 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:54.220263 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47" event={"ID":"f7272b0e-1af3-49b5-b9a5-d4585cc6be22","Type":"ContainerDied","Data":"9a082e6da2c5a25fb40ffac9a912fd60ed5cc9cf24d9a23b93ee665a87869806"}
Apr 16 23:31:55.225933 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:55.225896 2564 generic.go:358] "Generic (PLEG): container finished" podID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerID="3f337811a85f72d26289937cc803d1756d62ec0a919fdd7c32c7b6a11469d295" exitCode=0
Apr 16 23:31:55.226323 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:55.225981 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47" event={"ID":"f7272b0e-1af3-49b5-b9a5-d4585cc6be22","Type":"ContainerDied","Data":"3f337811a85f72d26289937cc803d1756d62ec0a919fdd7c32c7b6a11469d295"}
Apr 16 23:31:56.357433 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.357410 2564 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:31:56.381227 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.381190 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2g58\" (UniqueName: \"kubernetes.io/projected/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-kube-api-access-x2g58\") pod \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") "
Apr 16 23:31:56.381384 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.381255 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-util\") pod \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") "
Apr 16 23:31:56.381384 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.381325 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-bundle\") pod \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\" (UID: \"f7272b0e-1af3-49b5-b9a5-d4585cc6be22\") "
Apr 16 23:31:56.382381 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.382349 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-bundle" (OuterVolumeSpecName: "bundle") pod "f7272b0e-1af3-49b5-b9a5-d4585cc6be22" (UID: "f7272b0e-1af3-49b5-b9a5-d4585cc6be22"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:31:56.383457 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.383423 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-kube-api-access-x2g58" (OuterVolumeSpecName: "kube-api-access-x2g58") pod "f7272b0e-1af3-49b5-b9a5-d4585cc6be22" (UID: "f7272b0e-1af3-49b5-b9a5-d4585cc6be22"). InnerVolumeSpecName "kube-api-access-x2g58". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:31:56.387437 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.387414 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-util" (OuterVolumeSpecName: "util") pod "f7272b0e-1af3-49b5-b9a5-d4585cc6be22" (UID: "f7272b0e-1af3-49b5-b9a5-d4585cc6be22"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:31:56.482240 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.482162 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:31:56.482240 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.482198 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x2g58\" (UniqueName: \"kubernetes.io/projected/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-kube-api-access-x2g58\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:31:56.482240 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:56.482214 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7272b0e-1af3-49b5-b9a5-d4585cc6be22-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\""
Apr 16 23:31:57.234219 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:57.234180 2564 kubelet.go:2569] "SyncLoop (PLEG):
event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47" event={"ID":"f7272b0e-1af3-49b5-b9a5-d4585cc6be22","Type":"ContainerDied","Data":"5fc0d7c5e90269557bdaef891289fbe7450ce19368e32621abd3747cbc6e8767"}
Apr 16 23:31:57.234219 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:57.234222 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fc0d7c5e90269557bdaef891289fbe7450ce19368e32621abd3747cbc6e8767"
Apr 16 23:31:57.234429 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:31:57.234236 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jkl47"
Apr 16 23:32:08.196419 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.196382 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"]
Apr 16 23:32:08.196851 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.196714 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerName="pull"
Apr 16 23:32:08.196851 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.196725 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerName="pull"
Apr 16 23:32:08.196851 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.196739 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerName="util"
Apr 16 23:32:08.196851 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.196744 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerName="util"
Apr 16 23:32:08.196851 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.196753 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerName="extract"
Apr 16 23:32:08.196851 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.196758 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerName="extract"
Apr 16 23:32:08.196851 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.196828 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7272b0e-1af3-49b5-b9a5-d4585cc6be22" containerName="extract"
Apr 16 23:32:08.199711 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.199680 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.208648 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.205250 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 23:32:08.208648 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.205301 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 23:32:08.208648 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.205554 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 23:32:08.208648 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.205803 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:32:08.208648 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.205851 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 23:32:08.208648 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.206121 2564 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lk878\""
Apr 16 23:32:08.212261 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.212236 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"]
Apr 16 23:32:08.217798 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.217765 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"]
Apr 16 23:32:08.220925 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.220906 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.225932 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.225908 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 23:32:08.226032 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.225917 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 23:32:08.226124 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.226108 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-v6gnw\""
Apr 16 23:32:08.226233 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.226215 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 23:32:08.226618 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.226481 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 23:32:08.242394 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.242365 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"]
Apr 16 23:32:08.283867 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.283834 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c684f7a7-cc33-43fd-993c-56565eeab0ca-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID: \"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.284035 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.283892 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c684f7a7-cc33-43fd-993c-56565eeab0ca-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID: \"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.284035 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.283921 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjv4\" (UniqueName: \"kubernetes.io/projected/ee2cd232-bc5b-489a-a512-2db820b1a069-kube-api-access-dxjv4\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.284035 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.284023 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee2cd232-bc5b-489a-a512-2db820b1a069-metrics-cert\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.284215 ip-10-0-131-43
kubenswrapper[2564]: I0416 23:32:08.284052 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ee2cd232-bc5b-489a-a512-2db820b1a069-manager-config\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.284215 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.284075 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqv7r\" (UniqueName: \"kubernetes.io/projected/c684f7a7-cc33-43fd-993c-56565eeab0ca-kube-api-access-mqv7r\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID: \"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.284215 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.284097 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee2cd232-bc5b-489a-a512-2db820b1a069-cert\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.384990 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.384937 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c684f7a7-cc33-43fd-993c-56565eeab0ca-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID: \"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.385361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.385014 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\"
(UniqueName: \"kubernetes.io/secret/c684f7a7-cc33-43fd-993c-56565eeab0ca-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID: \"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.385361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.385037 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjv4\" (UniqueName: \"kubernetes.io/projected/ee2cd232-bc5b-489a-a512-2db820b1a069-kube-api-access-dxjv4\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.385361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.385091 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee2cd232-bc5b-489a-a512-2db820b1a069-metrics-cert\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.385361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.385120 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ee2cd232-bc5b-489a-a512-2db820b1a069-manager-config\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.385361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.385150 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqv7r\" (UniqueName: \"kubernetes.io/projected/c684f7a7-cc33-43fd-993c-56565eeab0ca-kube-api-access-mqv7r\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID:
\"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.385361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.385176 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee2cd232-bc5b-489a-a512-2db820b1a069-cert\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.386324 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.386297 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ee2cd232-bc5b-489a-a512-2db820b1a069-manager-config\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.387969 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.387940 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c684f7a7-cc33-43fd-993c-56565eeab0ca-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID: \"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.388159 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.388136 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c684f7a7-cc33-43fd-993c-56565eeab0ca-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID: \"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.388366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.388339 2564 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee2cd232-bc5b-489a-a512-2db820b1a069-metrics-cert\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.388939 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.388913 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee2cd232-bc5b-489a-a512-2db820b1a069-cert\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.410619 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.410589 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqv7r\" (UniqueName: \"kubernetes.io/projected/c684f7a7-cc33-43fd-993c-56565eeab0ca-kube-api-access-mqv7r\") pod \"opendatahub-operator-controller-manager-8bf69b96d-9t79m\" (UID: \"c684f7a7-cc33-43fd-993c-56565eeab0ca\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.415307 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.415278 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjv4\" (UniqueName: \"kubernetes.io/projected/ee2cd232-bc5b-489a-a512-2db820b1a069-kube-api-access-dxjv4\") pod \"lws-controller-manager-86bf875fd5-d558z\" (UID: \"ee2cd232-bc5b-489a-a512-2db820b1a069\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.515526 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.515493 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"
Apr 16 23:32:08.532187 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.532154 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"
Apr 16 23:32:08.644736 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.644678 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8"]
Apr 16 23:32:08.650154 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.650129 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8"
Apr 16 23:32:08.653070 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.652631 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6fmt\""
Apr 16 23:32:08.653070 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.652986 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 23:32:08.653256 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.653233 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 23:32:08.657218 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.657169 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8"]
Apr 16 23:32:08.659548 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.659523 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z"]
Apr 16 23:32:08.662418 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:32:08.662382 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2cd232_bc5b_489a_a512_2db820b1a069.slice/crio-f704b0cf89a6011026e3c4c96ad44e7aa861a123df1345c5660a441c4d73a6cf WatchSource:0}: Error
finding container f704b0cf89a6011026e3c4c96ad44e7aa861a123df1345c5660a441c4d73a6cf: Status 404 returned error can't find the container with id f704b0cf89a6011026e3c4c96ad44e7aa861a123df1345c5660a441c4d73a6cf
Apr 16 23:32:08.690808 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.690775 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m"]
Apr 16 23:32:08.693649 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:32:08.693621 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc684f7a7_cc33_43fd_993c_56565eeab0ca.slice/crio-feaeb14738f896deab3f814501ea45c7c740d558580b246dbaa8d980eb13162a WatchSource:0}: Error finding container feaeb14738f896deab3f814501ea45c7c740d558580b246dbaa8d980eb13162a: Status 404 returned error can't find the container with id feaeb14738f896deab3f814501ea45c7c740d558580b246dbaa8d980eb13162a
Apr 16 23:32:08.789043 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.788970 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8"
Apr 16 23:32:08.789043 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.789008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8"
Apr 16 23:32:08.789202 ip-10-0-131-43
kubenswrapper[2564]: I0416 23:32:08.789069 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4fv6\" (UniqueName: \"kubernetes.io/projected/1f803605-d64e-453a-9d1a-c29a8cb54363-kube-api-access-b4fv6\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:08.890226 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.890190 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:08.890387 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.890230 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:08.890387 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.890275 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4fv6\" (UniqueName: \"kubernetes.io/projected/1f803605-d64e-453a-9d1a-c29a8cb54363-kube-api-access-b4fv6\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:08.890586 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:32:08.890565 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:08.890642 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.890619 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:08.899110 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.899079 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4fv6\" (UniqueName: \"kubernetes.io/projected/1f803605-d64e-453a-9d1a-c29a8cb54363-kube-api-access-b4fv6\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:08.963004 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:08.962972 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:09.088021 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:09.087989 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8"] Apr 16 23:32:09.090604 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:32:09.090573 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f803605_d64e_453a_9d1a_c29a8cb54363.slice/crio-1c52cc2214a224467374d81de19859c4204be60f0f0cd54c465b690874f89a54 WatchSource:0}: Error finding container 1c52cc2214a224467374d81de19859c4204be60f0f0cd54c465b690874f89a54: Status 404 returned error can't find the container with id 1c52cc2214a224467374d81de19859c4204be60f0f0cd54c465b690874f89a54 Apr 16 23:32:09.281570 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:09.281529 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z" event={"ID":"ee2cd232-bc5b-489a-a512-2db820b1a069","Type":"ContainerStarted","Data":"f704b0cf89a6011026e3c4c96ad44e7aa861a123df1345c5660a441c4d73a6cf"} Apr 16 23:32:09.283186 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:09.283141 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m" event={"ID":"c684f7a7-cc33-43fd-993c-56565eeab0ca","Type":"ContainerStarted","Data":"feaeb14738f896deab3f814501ea45c7c740d558580b246dbaa8d980eb13162a"} Apr 16 23:32:09.285076 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:09.285048 2564 generic.go:358] "Generic (PLEG): container finished" podID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerID="c040d6cf93b0dc5c5f253ecee614573a898d71a36848ba4ae46a675f254671b8" exitCode=0 Apr 16 23:32:09.285215 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:09.285153 2564 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" event={"ID":"1f803605-d64e-453a-9d1a-c29a8cb54363","Type":"ContainerDied","Data":"c040d6cf93b0dc5c5f253ecee614573a898d71a36848ba4ae46a675f254671b8"} Apr 16 23:32:09.285215 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:09.285175 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" event={"ID":"1f803605-d64e-453a-9d1a-c29a8cb54363","Type":"ContainerStarted","Data":"1c52cc2214a224467374d81de19859c4204be60f0f0cd54c465b690874f89a54"} Apr 16 23:32:12.299313 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:12.299270 2564 generic.go:358] "Generic (PLEG): container finished" podID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerID="8c70bb168596212cc9bca9767bee7888272b77217bc47ba897971f706f2e45ba" exitCode=0 Apr 16 23:32:12.299844 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:12.299369 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" event={"ID":"1f803605-d64e-453a-9d1a-c29a8cb54363","Type":"ContainerDied","Data":"8c70bb168596212cc9bca9767bee7888272b77217bc47ba897971f706f2e45ba"} Apr 16 23:32:12.301101 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:12.301068 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z" event={"ID":"ee2cd232-bc5b-489a-a512-2db820b1a069","Type":"ContainerStarted","Data":"b7a278bd5e22e3486200e69919190f9bcca4904d2497458e24e3366d9882323b"} Apr 16 23:32:12.301281 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:12.301252 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z" Apr 16 23:32:12.302629 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:12.302605 2564 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m" event={"ID":"c684f7a7-cc33-43fd-993c-56565eeab0ca","Type":"ContainerStarted","Data":"d50511f3ee6f93b4ea0fee5d1526beec647660c27d251617d4edd4ddb6743e98"} Apr 16 23:32:12.304870 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:12.304850 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m" Apr 16 23:32:12.335964 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:12.335918 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z" podStartSLOduration=1.388138772 podStartE2EDuration="4.335904716s" podCreationTimestamp="2026-04-16 23:32:08 +0000 UTC" firstStartedPulling="2026-04-16 23:32:08.664675181 +0000 UTC m=+369.280706639" lastFinishedPulling="2026-04-16 23:32:11.612441107 +0000 UTC m=+372.228472583" observedRunningTime="2026-04-16 23:32:12.334584241 +0000 UTC m=+372.950615721" watchObservedRunningTime="2026-04-16 23:32:12.335904716 +0000 UTC m=+372.951936195" Apr 16 23:32:12.361165 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:12.361120 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m" podStartSLOduration=1.4443188359999999 podStartE2EDuration="4.361104619s" podCreationTimestamp="2026-04-16 23:32:08 +0000 UTC" firstStartedPulling="2026-04-16 23:32:08.695403317 +0000 UTC m=+369.311434774" lastFinishedPulling="2026-04-16 23:32:11.612189085 +0000 UTC m=+372.228220557" observedRunningTime="2026-04-16 23:32:12.358880194 +0000 UTC m=+372.974911674" watchObservedRunningTime="2026-04-16 23:32:12.361104619 +0000 UTC m=+372.977136097" Apr 16 23:32:13.308226 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:13.308175 2564 generic.go:358] "Generic (PLEG): container finished" 
podID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerID="2e24fc2aa5388696871d106f9ad10b31fa252b05dede8898d2fa5327a9d01210" exitCode=0 Apr 16 23:32:13.308626 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:13.308263 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" event={"ID":"1f803605-d64e-453a-9d1a-c29a8cb54363","Type":"ContainerDied","Data":"2e24fc2aa5388696871d106f9ad10b31fa252b05dede8898d2fa5327a9d01210"} Apr 16 23:32:14.433907 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.433884 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:14.539726 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.539679 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4fv6\" (UniqueName: \"kubernetes.io/projected/1f803605-d64e-453a-9d1a-c29a8cb54363-kube-api-access-b4fv6\") pod \"1f803605-d64e-453a-9d1a-c29a8cb54363\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " Apr 16 23:32:14.539726 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.539728 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-util\") pod \"1f803605-d64e-453a-9d1a-c29a8cb54363\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " Apr 16 23:32:14.539927 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.539777 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-bundle\") pod \"1f803605-d64e-453a-9d1a-c29a8cb54363\" (UID: \"1f803605-d64e-453a-9d1a-c29a8cb54363\") " Apr 16 23:32:14.540632 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.540595 2564 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-bundle" (OuterVolumeSpecName: "bundle") pod "1f803605-d64e-453a-9d1a-c29a8cb54363" (UID: "1f803605-d64e-453a-9d1a-c29a8cb54363"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:32:14.541751 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.541690 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f803605-d64e-453a-9d1a-c29a8cb54363-kube-api-access-b4fv6" (OuterVolumeSpecName: "kube-api-access-b4fv6") pod "1f803605-d64e-453a-9d1a-c29a8cb54363" (UID: "1f803605-d64e-453a-9d1a-c29a8cb54363"). InnerVolumeSpecName "kube-api-access-b4fv6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:32:14.641033 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.640932 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b4fv6\" (UniqueName: \"kubernetes.io/projected/1f803605-d64e-453a-9d1a-c29a8cb54363-kube-api-access-b4fv6\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:14.641033 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.640964 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:14.692045 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.692004 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-util" (OuterVolumeSpecName: "util") pod "1f803605-d64e-453a-9d1a-c29a8cb54363" (UID: "1f803605-d64e-453a-9d1a-c29a8cb54363"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:32:14.741518 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:14.741489 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f803605-d64e-453a-9d1a-c29a8cb54363-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:15.316905 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:15.316869 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" event={"ID":"1f803605-d64e-453a-9d1a-c29a8cb54363","Type":"ContainerDied","Data":"1c52cc2214a224467374d81de19859c4204be60f0f0cd54c465b690874f89a54"} Apr 16 23:32:15.316905 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:15.316895 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9w66s8" Apr 16 23:32:15.316905 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:15.316911 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c52cc2214a224467374d81de19859c4204be60f0f0cd54c465b690874f89a54" Apr 16 23:32:23.310428 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:23.310391 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-d558z" Apr 16 23:32:23.311006 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:23.310556 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-9t79m" Apr 16 23:32:26.308324 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.308283 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-577868f455-sqcr4"] Apr 16 23:32:26.308858 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.308811 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerName="util" Apr 16 23:32:26.308858 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.308831 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerName="util" Apr 16 23:32:26.308858 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.308843 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerName="extract" Apr 16 23:32:26.308858 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.308849 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerName="extract" Apr 16 23:32:26.309014 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.308860 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerName="pull" Apr 16 23:32:26.309014 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.308868 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerName="pull" Apr 16 23:32:26.309014 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.308956 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f803605-d64e-453a-9d1a-c29a8cb54363" containerName="extract" Apr 16 23:32:26.315373 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.315351 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.317531 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.317504 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 23:32:26.317665 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.317645 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 23:32:26.318545 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.318519 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 23:32:26.318652 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.318585 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-245l5\"" Apr 16 23:32:26.318652 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.318607 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 23:32:26.322571 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.322548 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-577868f455-sqcr4"] Apr 16 23:32:26.445908 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.445873 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4767bdaf-3083-4960-9ed8-e4ad26613b2d-tmp\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.446079 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.445948 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhfwh\" (UniqueName: 
\"kubernetes.io/projected/4767bdaf-3083-4960-9ed8-e4ad26613b2d-kube-api-access-zhfwh\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.446079 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.445991 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4767bdaf-3083-4960-9ed8-e4ad26613b2d-tls-certs\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.546742 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.546685 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4767bdaf-3083-4960-9ed8-e4ad26613b2d-tmp\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.546926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.546771 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhfwh\" (UniqueName: \"kubernetes.io/projected/4767bdaf-3083-4960-9ed8-e4ad26613b2d-kube-api-access-zhfwh\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.546926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.546805 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4767bdaf-3083-4960-9ed8-e4ad26613b2d-tls-certs\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.548932 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.548895 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4767bdaf-3083-4960-9ed8-e4ad26613b2d-tmp\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.549193 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.549171 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4767bdaf-3083-4960-9ed8-e4ad26613b2d-tls-certs\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.553799 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.553781 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhfwh\" (UniqueName: \"kubernetes.io/projected/4767bdaf-3083-4960-9ed8-e4ad26613b2d-kube-api-access-zhfwh\") pod \"kube-auth-proxy-577868f455-sqcr4\" (UID: \"4767bdaf-3083-4960-9ed8-e4ad26613b2d\") " pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.626271 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.626183 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" Apr 16 23:32:26.751604 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:26.751471 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-577868f455-sqcr4"] Apr 16 23:32:26.753982 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:32:26.753955 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4767bdaf_3083_4960_9ed8_e4ad26613b2d.slice/crio-3c40925db772b2368f37e0948bebcd497a7b797263c96b9e4cdcf51b0aa173a3 WatchSource:0}: Error finding container 3c40925db772b2368f37e0948bebcd497a7b797263c96b9e4cdcf51b0aa173a3: Status 404 returned error can't find the container with id 3c40925db772b2368f37e0948bebcd497a7b797263c96b9e4cdcf51b0aa173a3 Apr 16 23:32:27.358690 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:27.358650 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" event={"ID":"4767bdaf-3083-4960-9ed8-e4ad26613b2d","Type":"ContainerStarted","Data":"3c40925db772b2368f37e0948bebcd497a7b797263c96b9e4cdcf51b0aa173a3"} Apr 16 23:32:28.026046 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.026008 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r"] Apr 16 23:32:28.030569 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.030541 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.032911 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.032880 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 23:32:28.033222 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.033169 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6fmt\"" Apr 16 23:32:28.034187 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.033498 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 23:32:28.036340 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.036314 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r"] Apr 16 23:32:28.160219 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.160185 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m85w7\" (UniqueName: \"kubernetes.io/projected/b467c093-f959-4334-a766-597942ebded8-kube-api-access-m85w7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.160417 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.160249 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.160490 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.160412 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.261082 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.261046 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.261263 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.261134 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m85w7\" (UniqueName: \"kubernetes.io/projected/b467c093-f959-4334-a766-597942ebded8-kube-api-access-m85w7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.261263 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.261181 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.261462 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.261438 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.261531 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.261485 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.268897 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.268869 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m85w7\" (UniqueName: \"kubernetes.io/projected/b467c093-f959-4334-a766-597942ebded8-kube-api-access-m85w7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.343491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.343411 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:28.795280 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:28.795215 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r"] Apr 16 23:32:28.797335 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:32:28.797306 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb467c093_f959_4334_a766_597942ebded8.slice/crio-929dc04058940ed6ec27a8ee2fbf93ae1aef86d8d06172b72c9508e8a201e5b7 WatchSource:0}: Error finding container 929dc04058940ed6ec27a8ee2fbf93ae1aef86d8d06172b72c9508e8a201e5b7: Status 404 returned error can't find the container with id 929dc04058940ed6ec27a8ee2fbf93ae1aef86d8d06172b72c9508e8a201e5b7 Apr 16 23:32:29.368128 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:29.368091 2564 generic.go:358] "Generic (PLEG): container finished" podID="b467c093-f959-4334-a766-597942ebded8" containerID="60e26719e1a69667af613f3328152f19d8d4f24e7b00697ac757c99f50529a08" exitCode=0 Apr 16 23:32:29.368311 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:29.368154 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" event={"ID":"b467c093-f959-4334-a766-597942ebded8","Type":"ContainerDied","Data":"60e26719e1a69667af613f3328152f19d8d4f24e7b00697ac757c99f50529a08"} Apr 16 23:32:29.368311 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:29.368182 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" event={"ID":"b467c093-f959-4334-a766-597942ebded8","Type":"ContainerStarted","Data":"929dc04058940ed6ec27a8ee2fbf93ae1aef86d8d06172b72c9508e8a201e5b7"} Apr 16 23:32:30.374642 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:32:30.374605 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" event={"ID":"4767bdaf-3083-4960-9ed8-e4ad26613b2d","Type":"ContainerStarted","Data":"09beb929ef5fb29badaf8cd802c9cc28f447f10c89d18481ffecf44b161462c0"} Apr 16 23:32:30.376288 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:30.376265 2564 generic.go:358] "Generic (PLEG): container finished" podID="b467c093-f959-4334-a766-597942ebded8" containerID="7ef43489e960d84b4404f7819a5915170854ec80adcdef7c8ddc391c9a910f84" exitCode=0 Apr 16 23:32:30.376364 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:30.376308 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" event={"ID":"b467c093-f959-4334-a766-597942ebded8","Type":"ContainerDied","Data":"7ef43489e960d84b4404f7819a5915170854ec80adcdef7c8ddc391c9a910f84"} Apr 16 23:32:30.388845 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:30.388669 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-577868f455-sqcr4" podStartSLOduration=1.257741505 podStartE2EDuration="4.388652687s" podCreationTimestamp="2026-04-16 23:32:26 +0000 UTC" firstStartedPulling="2026-04-16 23:32:26.755746516 +0000 UTC m=+387.371777977" lastFinishedPulling="2026-04-16 23:32:29.886657699 +0000 UTC m=+390.502689159" observedRunningTime="2026-04-16 23:32:30.387861555 +0000 UTC m=+391.003893034" watchObservedRunningTime="2026-04-16 23:32:30.388652687 +0000 UTC m=+391.004684189" Apr 16 23:32:31.383091 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:31.383053 2564 generic.go:358] "Generic (PLEG): container finished" podID="b467c093-f959-4334-a766-597942ebded8" containerID="5a511ad463a305d0a721ca2a4e139910b5d3d0c8f063a541d2c621746b93b2ea" exitCode=0 Apr 16 23:32:31.383501 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:31.383141 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" event={"ID":"b467c093-f959-4334-a766-597942ebded8","Type":"ContainerDied","Data":"5a511ad463a305d0a721ca2a4e139910b5d3d0c8f063a541d2c621746b93b2ea"} Apr 16 23:32:32.523924 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.523901 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:32.701816 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.701728 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m85w7\" (UniqueName: \"kubernetes.io/projected/b467c093-f959-4334-a766-597942ebded8-kube-api-access-m85w7\") pod \"b467c093-f959-4334-a766-597942ebded8\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " Apr 16 23:32:32.701816 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.701817 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-util\") pod \"b467c093-f959-4334-a766-597942ebded8\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " Apr 16 23:32:32.702043 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.701846 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-bundle\") pod \"b467c093-f959-4334-a766-597942ebded8\" (UID: \"b467c093-f959-4334-a766-597942ebded8\") " Apr 16 23:32:32.702732 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.702682 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-bundle" (OuterVolumeSpecName: "bundle") pod "b467c093-f959-4334-a766-597942ebded8" (UID: "b467c093-f959-4334-a766-597942ebded8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:32:32.703866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.703836 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b467c093-f959-4334-a766-597942ebded8-kube-api-access-m85w7" (OuterVolumeSpecName: "kube-api-access-m85w7") pod "b467c093-f959-4334-a766-597942ebded8" (UID: "b467c093-f959-4334-a766-597942ebded8"). InnerVolumeSpecName "kube-api-access-m85w7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:32:32.707101 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.707075 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-util" (OuterVolumeSpecName: "util") pod "b467c093-f959-4334-a766-597942ebded8" (UID: "b467c093-f959-4334-a766-597942ebded8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:32:32.802452 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.802416 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:32.802452 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.802444 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b467c093-f959-4334-a766-597942ebded8-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:32.802452 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:32.802454 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m85w7\" (UniqueName: \"kubernetes.io/projected/b467c093-f959-4334-a766-597942ebded8-kube-api-access-m85w7\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:33.392079 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:33.392044 2564 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" event={"ID":"b467c093-f959-4334-a766-597942ebded8","Type":"ContainerDied","Data":"929dc04058940ed6ec27a8ee2fbf93ae1aef86d8d06172b72c9508e8a201e5b7"} Apr 16 23:32:33.392079 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:33.392079 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="929dc04058940ed6ec27a8ee2fbf93ae1aef86d8d06172b72c9508e8a201e5b7" Apr 16 23:32:33.392079 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:33.392086 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zjh6r" Apr 16 23:32:37.520639 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.520600 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85"] Apr 16 23:32:37.521189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.521111 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b467c093-f959-4334-a766-597942ebded8" containerName="util" Apr 16 23:32:37.521189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.521130 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="b467c093-f959-4334-a766-597942ebded8" containerName="util" Apr 16 23:32:37.521189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.521146 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b467c093-f959-4334-a766-597942ebded8" containerName="extract" Apr 16 23:32:37.521189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.521156 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="b467c093-f959-4334-a766-597942ebded8" containerName="extract" Apr 16 23:32:37.521189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.521180 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b467c093-f959-4334-a766-597942ebded8" containerName="pull" Apr 16 23:32:37.521189 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.521189 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="b467c093-f959-4334-a766-597942ebded8" containerName="pull" Apr 16 23:32:37.521502 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.521273 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="b467c093-f959-4334-a766-597942ebded8" containerName="extract" Apr 16 23:32:37.524361 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.524337 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.529460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.529437 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-g6fmt\"" Apr 16 23:32:37.531584 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.531557 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 23:32:37.531881 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.531862 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 23:32:37.537912 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.537885 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g7hv\" (UniqueName: \"kubernetes.io/projected/986b2ecf-7a1e-495a-9315-6dedad16b855-kube-api-access-7g7hv\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.538042 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.537934 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.538107 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.538087 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.546446 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.546420 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85"] Apr 16 23:32:37.639247 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.639212 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.639455 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.639291 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7g7hv\" (UniqueName: \"kubernetes.io/projected/986b2ecf-7a1e-495a-9315-6dedad16b855-kube-api-access-7g7hv\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: 
\"986b2ecf-7a1e-495a-9315-6dedad16b855\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.639455 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.639362 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.639685 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.639661 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.639880 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.639772 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.654407 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.654381 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g7hv\" (UniqueName: \"kubernetes.io/projected/986b2ecf-7a1e-495a-9315-6dedad16b855-kube-api-access-7g7hv\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.838511 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.838422 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:37.996490 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:37.996459 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85"] Apr 16 23:32:37.997731 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:32:37.997686 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod986b2ecf_7a1e_495a_9315_6dedad16b855.slice/crio-cf278e48ca97cd87aa4dffd66808d2dfcc85e5ab0899c37f0a8e1e616c7ea93c WatchSource:0}: Error finding container cf278e48ca97cd87aa4dffd66808d2dfcc85e5ab0899c37f0a8e1e616c7ea93c: Status 404 returned error can't find the container with id cf278e48ca97cd87aa4dffd66808d2dfcc85e5ab0899c37f0a8e1e616c7ea93c Apr 16 23:32:38.414860 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:38.414733 2564 generic.go:358] "Generic (PLEG): container finished" podID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerID="31c5409ba0c6c0d502513a889658d8650726f56fc835b8c20ba0ea2c27c8ec7c" exitCode=0 Apr 16 23:32:38.414860 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:38.414832 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" event={"ID":"986b2ecf-7a1e-495a-9315-6dedad16b855","Type":"ContainerDied","Data":"31c5409ba0c6c0d502513a889658d8650726f56fc835b8c20ba0ea2c27c8ec7c"} Apr 16 23:32:38.415214 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:38.414872 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" event={"ID":"986b2ecf-7a1e-495a-9315-6dedad16b855","Type":"ContainerStarted","Data":"cf278e48ca97cd87aa4dffd66808d2dfcc85e5ab0899c37f0a8e1e616c7ea93c"} Apr 16 23:32:39.420339 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:39.420298 2564 generic.go:358] "Generic (PLEG): container finished" podID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerID="05ea2bf302a71fb5400cd2d7bff626b609ad27df253be8de49fce7a8f2cdd0d6" exitCode=0 Apr 16 23:32:39.420744 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:39.420389 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" event={"ID":"986b2ecf-7a1e-495a-9315-6dedad16b855","Type":"ContainerDied","Data":"05ea2bf302a71fb5400cd2d7bff626b609ad27df253be8de49fce7a8f2cdd0d6"} Apr 16 23:32:40.426239 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:40.426205 2564 generic.go:358] "Generic (PLEG): container finished" podID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerID="a083f235fb88f84d6f27060676a06cd0307ce4600634f993abf7162df4105cca" exitCode=0 Apr 16 23:32:40.426630 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:40.426301 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" event={"ID":"986b2ecf-7a1e-495a-9315-6dedad16b855","Type":"ContainerDied","Data":"a083f235fb88f84d6f27060676a06cd0307ce4600634f993abf7162df4105cca"} Apr 16 23:32:41.554102 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.554078 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:41.568506 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.568479 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-bundle\") pod \"986b2ecf-7a1e-495a-9315-6dedad16b855\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " Apr 16 23:32:41.568633 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.568543 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g7hv\" (UniqueName: \"kubernetes.io/projected/986b2ecf-7a1e-495a-9315-6dedad16b855-kube-api-access-7g7hv\") pod \"986b2ecf-7a1e-495a-9315-6dedad16b855\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " Apr 16 23:32:41.568633 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.568571 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-util\") pod \"986b2ecf-7a1e-495a-9315-6dedad16b855\" (UID: \"986b2ecf-7a1e-495a-9315-6dedad16b855\") " Apr 16 23:32:41.570579 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.570548 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-bundle" (OuterVolumeSpecName: "bundle") pod "986b2ecf-7a1e-495a-9315-6dedad16b855" (UID: "986b2ecf-7a1e-495a-9315-6dedad16b855"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:32:41.571362 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.571333 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986b2ecf-7a1e-495a-9315-6dedad16b855-kube-api-access-7g7hv" (OuterVolumeSpecName: "kube-api-access-7g7hv") pod "986b2ecf-7a1e-495a-9315-6dedad16b855" (UID: "986b2ecf-7a1e-495a-9315-6dedad16b855"). InnerVolumeSpecName "kube-api-access-7g7hv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:32:41.576887 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.576863 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-util" (OuterVolumeSpecName: "util") pod "986b2ecf-7a1e-495a-9315-6dedad16b855" (UID: "986b2ecf-7a1e-495a-9315-6dedad16b855"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:32:41.669270 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.669237 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:41.669270 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.669267 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7g7hv\" (UniqueName: \"kubernetes.io/projected/986b2ecf-7a1e-495a-9315-6dedad16b855-kube-api-access-7g7hv\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:41.669270 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:41.669276 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/986b2ecf-7a1e-495a-9315-6dedad16b855-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:32:42.435992 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:42.435907 2564 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" event={"ID":"986b2ecf-7a1e-495a-9315-6dedad16b855","Type":"ContainerDied","Data":"cf278e48ca97cd87aa4dffd66808d2dfcc85e5ab0899c37f0a8e1e616c7ea93c"} Apr 16 23:32:42.435992 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:42.435931 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2hnd85" Apr 16 23:32:42.435992 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:32:42.435943 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf278e48ca97cd87aa4dffd66808d2dfcc85e5ab0899c37f0a8e1e616c7ea93c" Apr 16 23:33:42.671527 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.671490 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8"] Apr 16 23:33:42.672068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.671841 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerName="pull" Apr 16 23:33:42.672068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.671853 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerName="pull" Apr 16 23:33:42.672068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.671868 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerName="util" Apr 16 23:33:42.672068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.671873 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerName="util" Apr 16 23:33:42.672068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.671890 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerName="extract" Apr 16 23:33:42.672068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.671896 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerName="extract" Apr 16 23:33:42.672068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.671946 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="986b2ecf-7a1e-495a-9315-6dedad16b855" containerName="extract" Apr 16 23:33:42.675119 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.675100 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.677298 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.677278 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 23:33:42.678234 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.678188 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 23:33:42.678371 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.678339 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-26zbs\"" Apr 16 23:33:42.680653 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.680627 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8"] Apr 16 23:33:42.785815 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.785782 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.785984 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.785831 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.785984 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.785910 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz4xv\" (UniqueName: \"kubernetes.io/projected/797c63f2-40b6-41f4-95ec-c57ed738695a-kube-api-access-bz4xv\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.887298 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.887263 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.887460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.887432 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bz4xv\" (UniqueName: \"kubernetes.io/projected/797c63f2-40b6-41f4-95ec-c57ed738695a-kube-api-access-bz4xv\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.887551 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.887532 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.887767 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.887743 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.887879 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.887865 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.896529 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:42.896500 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz4xv\" (UniqueName: \"kubernetes.io/projected/797c63f2-40b6-41f4-95ec-c57ed738695a-kube-api-access-bz4xv\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:42.985807 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:33:42.985773 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:43.110230 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.110201 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8"] Apr 16 23:33:43.111981 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:33:43.111955 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod797c63f2_40b6_41f4_95ec_c57ed738695a.slice/crio-1f62f157705cafc1c95dfbbc03d0b8a60a6f9768b1efac3602483977c3a159ff WatchSource:0}: Error finding container 1f62f157705cafc1c95dfbbc03d0b8a60a6f9768b1efac3602483977c3a159ff: Status 404 returned error can't find the container with id 1f62f157705cafc1c95dfbbc03d0b8a60a6f9768b1efac3602483977c3a159ff Apr 16 23:33:43.267321 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.267250 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg"] Apr 16 23:33:43.270660 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.270640 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.278756 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.278731 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg"] Apr 16 23:33:43.291269 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.291238 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.291269 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.291273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2txz\" (UniqueName: \"kubernetes.io/projected/67a674cd-ba01-41f4-877c-55adb3dd845c-kube-api-access-d2txz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.291449 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.291361 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.392528 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.392478 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.392738 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.392561 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.392738 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.392589 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2txz\" (UniqueName: \"kubernetes.io/projected/67a674cd-ba01-41f4-877c-55adb3dd845c-kube-api-access-d2txz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.392983 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.392961 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.393024 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.392969 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.400385 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.400356 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2txz\" (UniqueName: \"kubernetes.io/projected/67a674cd-ba01-41f4-877c-55adb3dd845c-kube-api-access-d2txz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.581941 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.581858 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:43.651181 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.651141 2564 generic.go:358] "Generic (PLEG): container finished" podID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerID="8b1239b3cf4ad07530b9fe0db3a591ce2c6311a24d424b3df49c3a350e61b20b" exitCode=0 Apr 16 23:33:43.651339 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.651222 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" event={"ID":"797c63f2-40b6-41f4-95ec-c57ed738695a","Type":"ContainerDied","Data":"8b1239b3cf4ad07530b9fe0db3a591ce2c6311a24d424b3df49c3a350e61b20b"} Apr 16 23:33:43.651339 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.651271 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" event={"ID":"797c63f2-40b6-41f4-95ec-c57ed738695a","Type":"ContainerStarted","Data":"1f62f157705cafc1c95dfbbc03d0b8a60a6f9768b1efac3602483977c3a159ff"} Apr 16 23:33:43.666669 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:33:43.666635 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs"] Apr 16 23:33:43.672360 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.672335 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.683558 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.683255 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs"] Apr 16 23:33:43.694712 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.694667 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.694834 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.694814 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnldf\" (UniqueName: \"kubernetes.io/projected/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-kube-api-access-fnldf\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.694883 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.694847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" 
(UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.715266 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.715244 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg"] Apr 16 23:33:43.716901 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:33:43.716877 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67a674cd_ba01_41f4_877c_55adb3dd845c.slice/crio-ce7d663bc1d8e94b3acf3569be402dd9561093c7f7bad9fff546b517fba83ae4 WatchSource:0}: Error finding container ce7d663bc1d8e94b3acf3569be402dd9561093c7f7bad9fff546b517fba83ae4: Status 404 returned error can't find the container with id ce7d663bc1d8e94b3acf3569be402dd9561093c7f7bad9fff546b517fba83ae4 Apr 16 23:33:43.795678 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.795646 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnldf\" (UniqueName: \"kubernetes.io/projected/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-kube-api-access-fnldf\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.795866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.795691 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.795866 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.795741 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.796086 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.796063 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.796086 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.796078 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.803796 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.803770 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnldf\" (UniqueName: \"kubernetes.io/projected/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-kube-api-access-fnldf\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:43.995157 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:43.995126 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:44.073687 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.073659 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss"] Apr 16 23:33:44.078772 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.078749 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.084550 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.084518 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss"] Apr 16 23:33:44.097668 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.097636 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.097810 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.097687 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.097810 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.097767 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwsxv\" (UniqueName: 
\"kubernetes.io/projected/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-kube-api-access-lwsxv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.120054 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.120028 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs"] Apr 16 23:33:44.121976 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:33:44.121946 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad4bf82c_9cdd_4e1c_805d_88353fa74ad6.slice/crio-9d58ce3a56634885b649a1c6388b088bd929a16e690f5be544201aea6523d6d8 WatchSource:0}: Error finding container 9d58ce3a56634885b649a1c6388b088bd929a16e690f5be544201aea6523d6d8: Status 404 returned error can't find the container with id 9d58ce3a56634885b649a1c6388b088bd929a16e690f5be544201aea6523d6d8 Apr 16 23:33:44.198951 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.198911 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.198951 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.198954 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.199167 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.199087 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwsxv\" (UniqueName: \"kubernetes.io/projected/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-kube-api-access-lwsxv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.199282 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.199264 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.199349 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.199326 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.206415 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.206390 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwsxv\" (UniqueName: \"kubernetes.io/projected/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-kube-api-access-lwsxv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 
23:33:44.390756 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.390729 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:44.515553 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.515528 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss"] Apr 16 23:33:44.516789 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:33:44.516757 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221ea960_cf1a_46e4_a80b_8a7bd2d9b84a.slice/crio-db2dfd689b6fefe6adc0e6738d8cb2e9f7c7a0c4f67eee50e00f1de3d2c28eba WatchSource:0}: Error finding container db2dfd689b6fefe6adc0e6738d8cb2e9f7c7a0c4f67eee50e00f1de3d2c28eba: Status 404 returned error can't find the container with id db2dfd689b6fefe6adc0e6738d8cb2e9f7c7a0c4f67eee50e00f1de3d2c28eba Apr 16 23:33:44.656973 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.656934 2564 generic.go:358] "Generic (PLEG): container finished" podID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerID="d78e77cf6f8df1f758c39050047ea1ee787b4a9a98d3d56350cc296b2560df56" exitCode=0 Apr 16 23:33:44.657146 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.657011 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" event={"ID":"797c63f2-40b6-41f4-95ec-c57ed738695a","Type":"ContainerDied","Data":"d78e77cf6f8df1f758c39050047ea1ee787b4a9a98d3d56350cc296b2560df56"} Apr 16 23:33:44.658471 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.658448 2564 generic.go:358] "Generic (PLEG): container finished" podID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerID="dbb486d6617d7a2b01dc8d07911d0a67e629e5ed8829c1beadd9ba9baa98be70" exitCode=0 Apr 16 23:33:44.658575 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:33:44.658534 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" event={"ID":"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a","Type":"ContainerDied","Data":"dbb486d6617d7a2b01dc8d07911d0a67e629e5ed8829c1beadd9ba9baa98be70"} Apr 16 23:33:44.658575 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.658564 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" event={"ID":"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a","Type":"ContainerStarted","Data":"db2dfd689b6fefe6adc0e6738d8cb2e9f7c7a0c4f67eee50e00f1de3d2c28eba"} Apr 16 23:33:44.660091 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.660068 2564 generic.go:358] "Generic (PLEG): container finished" podID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerID="11ea9f3b0f5575267efb2ca62fc9e45453fb1e2e22126542a2bca91aac3be960" exitCode=0 Apr 16 23:33:44.660205 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.660108 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" event={"ID":"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6","Type":"ContainerDied","Data":"11ea9f3b0f5575267efb2ca62fc9e45453fb1e2e22126542a2bca91aac3be960"} Apr 16 23:33:44.660205 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.660144 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" event={"ID":"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6","Type":"ContainerStarted","Data":"9d58ce3a56634885b649a1c6388b088bd929a16e690f5be544201aea6523d6d8"} Apr 16 23:33:44.661756 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.661734 2564 generic.go:358] "Generic (PLEG): container finished" podID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerID="340ddfd78d05c0d2ad482da44b15421234a367a40544c438b0f3f5e91d9cc229" exitCode=0 Apr 16 23:33:44.661839 
ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.661774 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" event={"ID":"67a674cd-ba01-41f4-877c-55adb3dd845c","Type":"ContainerDied","Data":"340ddfd78d05c0d2ad482da44b15421234a367a40544c438b0f3f5e91d9cc229"} Apr 16 23:33:44.661839 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:44.661792 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" event={"ID":"67a674cd-ba01-41f4-877c-55adb3dd845c","Type":"ContainerStarted","Data":"ce7d663bc1d8e94b3acf3569be402dd9561093c7f7bad9fff546b517fba83ae4"} Apr 16 23:33:45.669062 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:45.668897 2564 generic.go:358] "Generic (PLEG): container finished" podID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerID="bc76c165f37ba34f727d2193e23d5a910776133c1f237350b31376cd8acab6c5" exitCode=0 Apr 16 23:33:45.669062 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:45.668978 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" event={"ID":"797c63f2-40b6-41f4-95ec-c57ed738695a","Type":"ContainerDied","Data":"bc76c165f37ba34f727d2193e23d5a910776133c1f237350b31376cd8acab6c5"} Apr 16 23:33:45.670675 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:45.670653 2564 generic.go:358] "Generic (PLEG): container finished" podID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerID="fa6bcef5af324095b1187dbfe94586fa08c4dc597838bd97a39cb5dbcb951e0a" exitCode=0 Apr 16 23:33:45.670801 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:45.670737 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" 
event={"ID":"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a","Type":"ContainerDied","Data":"fa6bcef5af324095b1187dbfe94586fa08c4dc597838bd97a39cb5dbcb951e0a"} Apr 16 23:33:45.672539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:45.672512 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" event={"ID":"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6","Type":"ContainerStarted","Data":"ffc122fabcc00fde67c96af90551abd1a69a445e32bb85b30772fd20060e7062"} Apr 16 23:33:45.674410 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:45.674386 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" event={"ID":"67a674cd-ba01-41f4-877c-55adb3dd845c","Type":"ContainerStarted","Data":"b44e549b581291f6ee016bd550b5fa27b91f8343e8b3fe3b88c18c098868f063"} Apr 16 23:33:46.679603 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.679570 2564 generic.go:358] "Generic (PLEG): container finished" podID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerID="b44e549b581291f6ee016bd550b5fa27b91f8343e8b3fe3b88c18c098868f063" exitCode=0 Apr 16 23:33:46.680071 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.679659 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" event={"ID":"67a674cd-ba01-41f4-877c-55adb3dd845c","Type":"ContainerDied","Data":"b44e549b581291f6ee016bd550b5fa27b91f8343e8b3fe3b88c18c098868f063"} Apr 16 23:33:46.681643 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.681623 2564 generic.go:358] "Generic (PLEG): container finished" podID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerID="4b932a0db00f64a6c406f88ce18bae891d9930b801f26aa1f777c19dfcde2e40" exitCode=0 Apr 16 23:33:46.681761 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.681716 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" event={"ID":"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a","Type":"ContainerDied","Data":"4b932a0db00f64a6c406f88ce18bae891d9930b801f26aa1f777c19dfcde2e40"} Apr 16 23:33:46.686265 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.686216 2564 generic.go:358] "Generic (PLEG): container finished" podID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerID="ffc122fabcc00fde67c96af90551abd1a69a445e32bb85b30772fd20060e7062" exitCode=0 Apr 16 23:33:46.686385 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.686359 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" event={"ID":"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6","Type":"ContainerDied","Data":"ffc122fabcc00fde67c96af90551abd1a69a445e32bb85b30772fd20060e7062"} Apr 16 23:33:46.826415 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.826389 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:46.923534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.923505 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-util\") pod \"797c63f2-40b6-41f4-95ec-c57ed738695a\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " Apr 16 23:33:46.923665 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.923546 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz4xv\" (UniqueName: \"kubernetes.io/projected/797c63f2-40b6-41f4-95ec-c57ed738695a-kube-api-access-bz4xv\") pod \"797c63f2-40b6-41f4-95ec-c57ed738695a\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " Apr 16 23:33:46.923665 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.923584 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-bundle\") pod \"797c63f2-40b6-41f4-95ec-c57ed738695a\" (UID: \"797c63f2-40b6-41f4-95ec-c57ed738695a\") " Apr 16 23:33:46.924060 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.924038 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-bundle" (OuterVolumeSpecName: "bundle") pod "797c63f2-40b6-41f4-95ec-c57ed738695a" (UID: "797c63f2-40b6-41f4-95ec-c57ed738695a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:33:46.925802 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.925778 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797c63f2-40b6-41f4-95ec-c57ed738695a-kube-api-access-bz4xv" (OuterVolumeSpecName: "kube-api-access-bz4xv") pod "797c63f2-40b6-41f4-95ec-c57ed738695a" (UID: "797c63f2-40b6-41f4-95ec-c57ed738695a"). InnerVolumeSpecName "kube-api-access-bz4xv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:33:46.929347 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:46.929312 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-util" (OuterVolumeSpecName: "util") pod "797c63f2-40b6-41f4-95ec-c57ed738695a" (UID: "797c63f2-40b6-41f4-95ec-c57ed738695a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:33:47.024494 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.024461 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:47.024494 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.024489 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/797c63f2-40b6-41f4-95ec-c57ed738695a-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:47.024494 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.024499 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bz4xv\" (UniqueName: \"kubernetes.io/projected/797c63f2-40b6-41f4-95ec-c57ed738695a-kube-api-access-bz4xv\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:47.691810 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.691773 2564 generic.go:358] "Generic (PLEG): 
container finished" podID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerID="65a379cb63c6968a782c7888a02f662bab729b98a8e60e25762ba8a6456ba122" exitCode=0 Apr 16 23:33:47.692220 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.691828 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" event={"ID":"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6","Type":"ContainerDied","Data":"65a379cb63c6968a782c7888a02f662bab729b98a8e60e25762ba8a6456ba122"} Apr 16 23:33:47.693665 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.693637 2564 generic.go:358] "Generic (PLEG): container finished" podID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerID="2ff6982846d5e7a1de6b7c2275caaa48eb3cac1642774dec1eb5ac3f9daf4ea1" exitCode=0 Apr 16 23:33:47.693810 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.693731 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" event={"ID":"67a674cd-ba01-41f4-877c-55adb3dd845c","Type":"ContainerDied","Data":"2ff6982846d5e7a1de6b7c2275caaa48eb3cac1642774dec1eb5ac3f9daf4ea1"} Apr 16 23:33:47.695412 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.695395 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" Apr 16 23:33:47.695412 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.695408 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8" event={"ID":"797c63f2-40b6-41f4-95ec-c57ed738695a","Type":"ContainerDied","Data":"1f62f157705cafc1c95dfbbc03d0b8a60a6f9768b1efac3602483977c3a159ff"} Apr 16 23:33:47.695574 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.695430 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f62f157705cafc1c95dfbbc03d0b8a60a6f9768b1efac3602483977c3a159ff" Apr 16 23:33:47.821986 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.821964 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:47.931809 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.931773 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-util\") pod \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " Apr 16 23:33:47.932010 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.931828 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-bundle\") pod \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " Apr 16 23:33:47.932010 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.931881 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwsxv\" (UniqueName: \"kubernetes.io/projected/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-kube-api-access-lwsxv\") pod 
\"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\" (UID: \"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a\") " Apr 16 23:33:47.932485 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.932459 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-bundle" (OuterVolumeSpecName: "bundle") pod "221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" (UID: "221ea960-cf1a-46e4-a80b-8a7bd2d9b84a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:33:47.933960 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.933933 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-kube-api-access-lwsxv" (OuterVolumeSpecName: "kube-api-access-lwsxv") pod "221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" (UID: "221ea960-cf1a-46e4-a80b-8a7bd2d9b84a"). InnerVolumeSpecName "kube-api-access-lwsxv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:33:47.937088 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:47.937051 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-util" (OuterVolumeSpecName: "util") pod "221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" (UID: "221ea960-cf1a-46e4-a80b-8a7bd2d9b84a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:33:48.032547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.032517 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:48.032547 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.032548 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:48.032751 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.032562 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwsxv\" (UniqueName: \"kubernetes.io/projected/221ea960-cf1a-46e4-a80b-8a7bd2d9b84a-kube-api-access-lwsxv\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:48.700840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.700803 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" Apr 16 23:33:48.700840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.700809 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss" event={"ID":"221ea960-cf1a-46e4-a80b-8a7bd2d9b84a","Type":"ContainerDied","Data":"db2dfd689b6fefe6adc0e6738d8cb2e9f7c7a0c4f67eee50e00f1de3d2c28eba"} Apr 16 23:33:48.700840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.700842 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db2dfd689b6fefe6adc0e6738d8cb2e9f7c7a0c4f67eee50e00f1de3d2c28eba" Apr 16 23:33:48.831507 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.831484 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:48.861015 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.860988 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:33:48.940176 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.940142 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnldf\" (UniqueName: \"kubernetes.io/projected/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-kube-api-access-fnldf\") pod \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " Apr 16 23:33:48.940369 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.940224 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-util\") pod \"67a674cd-ba01-41f4-877c-55adb3dd845c\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " Apr 16 23:33:48.940369 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.940250 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-bundle\") pod \"67a674cd-ba01-41f4-877c-55adb3dd845c\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " Apr 16 23:33:48.940369 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.940279 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2txz\" (UniqueName: \"kubernetes.io/projected/67a674cd-ba01-41f4-877c-55adb3dd845c-kube-api-access-d2txz\") pod \"67a674cd-ba01-41f4-877c-55adb3dd845c\" (UID: \"67a674cd-ba01-41f4-877c-55adb3dd845c\") " Apr 16 23:33:48.940369 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.940310 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-bundle\") pod \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " Apr 16 23:33:48.940369 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.940353 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-util\") pod \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\" (UID: \"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6\") " Apr 16 23:33:48.940868 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.940837 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-bundle" (OuterVolumeSpecName: "bundle") pod "ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" (UID: "ad4bf82c-9cdd-4e1c-805d-88353fa74ad6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:33:48.940983 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.940882 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-bundle" (OuterVolumeSpecName: "bundle") pod "67a674cd-ba01-41f4-877c-55adb3dd845c" (UID: "67a674cd-ba01-41f4-877c-55adb3dd845c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:33:48.942926 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.942899 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a674cd-ba01-41f4-877c-55adb3dd845c-kube-api-access-d2txz" (OuterVolumeSpecName: "kube-api-access-d2txz") pod "67a674cd-ba01-41f4-877c-55adb3dd845c" (UID: "67a674cd-ba01-41f4-877c-55adb3dd845c"). InnerVolumeSpecName "kube-api-access-d2txz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:33:48.943011 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.942907 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-kube-api-access-fnldf" (OuterVolumeSpecName: "kube-api-access-fnldf") pod "ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" (UID: "ad4bf82c-9cdd-4e1c-805d-88353fa74ad6"). InnerVolumeSpecName "kube-api-access-fnldf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:33:48.948562 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.948521 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-util" (OuterVolumeSpecName: "util") pod "67a674cd-ba01-41f4-877c-55adb3dd845c" (UID: "67a674cd-ba01-41f4-877c-55adb3dd845c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:33:48.949073 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:48.949041 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-util" (OuterVolumeSpecName: "util") pod "ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" (UID: "ad4bf82c-9cdd-4e1c-805d-88353fa74ad6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:33:49.041979 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.041938 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:49.041979 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.041977 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67a674cd-ba01-41f4-877c-55adb3dd845c-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:49.042178 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.041993 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2txz\" (UniqueName: \"kubernetes.io/projected/67a674cd-ba01-41f4-877c-55adb3dd845c-kube-api-access-d2txz\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:49.042178 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.042010 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:49.042178 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.042025 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-util\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:49.042178 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.042039 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnldf\" (UniqueName: \"kubernetes.io/projected/ad4bf82c-9cdd-4e1c-805d-88353fa74ad6-kube-api-access-fnldf\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:33:49.706729 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.706675 2564 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" Apr 16 23:33:49.707140 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.706731 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg" event={"ID":"67a674cd-ba01-41f4-877c-55adb3dd845c","Type":"ContainerDied","Data":"ce7d663bc1d8e94b3acf3569be402dd9561093c7f7bad9fff546b517fba83ae4"} Apr 16 23:33:49.707140 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.706764 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce7d663bc1d8e94b3acf3569be402dd9561093c7f7bad9fff546b517fba83ae4" Apr 16 23:33:49.708454 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.708435 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" event={"ID":"ad4bf82c-9cdd-4e1c-805d-88353fa74ad6","Type":"ContainerDied","Data":"9d58ce3a56634885b649a1c6388b088bd929a16e690f5be544201aea6523d6d8"} Apr 16 23:33:49.708454 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.708457 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d58ce3a56634885b649a1c6388b088bd929a16e690f5be544201aea6523d6d8" Apr 16 23:33:49.708591 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:33:49.708495 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs" Apr 16 23:34:00.911921 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.911878 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79d7cf776-sspzd"] Apr 16 23:34:00.912423 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912387 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerName="extract" Apr 16 23:34:00.912423 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912409 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerName="extract" Apr 16 23:34:00.912423 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912425 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerName="pull" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912434 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerName="pull" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912446 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerName="extract" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912455 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerName="extract" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912470 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerName="extract" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912478 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="797c63f2-40b6-41f4-95ec-c57ed738695a" 
containerName="extract" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912489 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerName="extract" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912497 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerName="extract" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912512 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerName="util" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912521 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerName="util" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912537 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerName="util" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912545 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerName="util" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912554 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerName="util" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912562 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerName="util" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912574 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerName="pull" Apr 16 23:34:00.912597 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:34:00.912583 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerName="pull" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912591 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerName="pull" Apr 16 23:34:00.912597 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912600 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerName="pull" Apr 16 23:34:00.913441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912614 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerName="util" Apr 16 23:34:00.913441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912622 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerName="util" Apr 16 23:34:00.913441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912635 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerName="pull" Apr 16 23:34:00.913441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912643 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerName="pull" Apr 16 23:34:00.913441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912750 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="67a674cd-ba01-41f4-877c-55adb3dd845c" containerName="extract" Apr 16 23:34:00.913441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912768 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="797c63f2-40b6-41f4-95ec-c57ed738695a" containerName="extract" Apr 16 23:34:00.913441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912779 2564 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="ad4bf82c-9cdd-4e1c-805d-88353fa74ad6" containerName="extract" Apr 16 23:34:00.913441 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.912792 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="221ea960-cf1a-46e4-a80b-8a7bd2d9b84a" containerName="extract" Apr 16 23:34:00.925076 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.925045 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:00.926820 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:00.926794 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79d7cf776-sspzd"] Apr 16 23:34:01.049169 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.049131 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-service-ca\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.049330 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.049182 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-oauth-config\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.049330 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.049210 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-serving-cert\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.049330 ip-10-0-131-43 
kubenswrapper[2564]: I0416 23:34:01.049235 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-config\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.049330 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.049254 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-trusted-ca-bundle\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.049330 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.049273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-oauth-serving-cert\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.049330 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.049309 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4hw\" (UniqueName: \"kubernetes.io/projected/b200247f-e2ef-48e2-95a2-daad0e34e8f7-kube-api-access-kf4hw\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.150311 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.150272 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4hw\" (UniqueName: \"kubernetes.io/projected/b200247f-e2ef-48e2-95a2-daad0e34e8f7-kube-api-access-kf4hw\") pod 
\"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.150479 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.150418 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-service-ca\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.150479 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.150450 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-oauth-config\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.150479 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.150471 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-serving-cert\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.150621 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.150494 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-config\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.150621 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.150518 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-trusted-ca-bundle\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.150621 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.150547 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-oauth-serving-cert\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.151267 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.151229 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-config\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.151365 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.151275 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-service-ca\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.151365 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.151304 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-oauth-serving-cert\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.151460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.151441 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/b200247f-e2ef-48e2-95a2-daad0e34e8f7-trusted-ca-bundle\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.153499 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.153479 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-serving-cert\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.153611 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.153591 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b200247f-e2ef-48e2-95a2-daad0e34e8f7-console-oauth-config\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.159840 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.159819 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4hw\" (UniqueName: \"kubernetes.io/projected/b200247f-e2ef-48e2-95a2-daad0e34e8f7-kube-api-access-kf4hw\") pod \"console-79d7cf776-sspzd\" (UID: \"b200247f-e2ef-48e2-95a2-daad0e34e8f7\") " pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.237987 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.237947 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:01.370216 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.370176 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79d7cf776-sspzd"] Apr 16 23:34:01.372503 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:34:01.372471 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb200247f_e2ef_48e2_95a2_daad0e34e8f7.slice/crio-4424e4a603685550cf9e08b9aff48ab301795f2d6d0e79c3ed86c08813fb3055 WatchSource:0}: Error finding container 4424e4a603685550cf9e08b9aff48ab301795f2d6d0e79c3ed86c08813fb3055: Status 404 returned error can't find the container with id 4424e4a603685550cf9e08b9aff48ab301795f2d6d0e79c3ed86c08813fb3055 Apr 16 23:34:01.754747 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.754713 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79d7cf776-sspzd" event={"ID":"b200247f-e2ef-48e2-95a2-daad0e34e8f7","Type":"ContainerStarted","Data":"4de007f4f8808236e070e3896d01b6863688708c628cf28dcabb32e01e731a67"} Apr 16 23:34:01.754747 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.754752 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79d7cf776-sspzd" event={"ID":"b200247f-e2ef-48e2-95a2-daad0e34e8f7","Type":"ContainerStarted","Data":"4424e4a603685550cf9e08b9aff48ab301795f2d6d0e79c3ed86c08813fb3055"} Apr 16 23:34:01.771345 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:01.771294 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79d7cf776-sspzd" podStartSLOduration=1.77127978 podStartE2EDuration="1.77127978s" podCreationTimestamp="2026-04-16 23:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:34:01.769844078 +0000 UTC m=+482.385875557" 
watchObservedRunningTime="2026-04-16 23:34:01.77127978 +0000 UTC m=+482.387311259" Apr 16 23:34:11.238662 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:11.238623 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:11.238662 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:11.238665 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:11.243850 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:11.243826 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:11.796952 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:11.796915 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79d7cf776-sspzd" Apr 16 23:34:11.885655 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:11.885620 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9c56f4866-cjq25"] Apr 16 23:34:18.345123 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.345091 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd"] Apr 16 23:34:18.348649 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.348631 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.350819 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.350797 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 23:34:18.350948 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.350929 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-26zbs\"" Apr 16 23:34:18.351011 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.350951 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 23:34:18.351068 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.351027 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 23:34:18.351747 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.351725 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 23:34:18.354924 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.354897 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd"] Apr 16 23:34:18.505254 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.505219 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7854\" (UniqueName: \"kubernetes.io/projected/7d2396f2-e57a-46cb-a71a-de7349961bbd-kube-api-access-x7854\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.505404 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.505278 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/7d2396f2-e57a-46cb-a71a-de7349961bbd-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.505404 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.505348 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d2396f2-e57a-46cb-a71a-de7349961bbd-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.606426 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.606320 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d2396f2-e57a-46cb-a71a-de7349961bbd-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.606426 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.606377 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d2396f2-e57a-46cb-a71a-de7349961bbd-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.606634 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.606464 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7854\" (UniqueName: \"kubernetes.io/projected/7d2396f2-e57a-46cb-a71a-de7349961bbd-kube-api-access-x7854\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.607021 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.606993 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d2396f2-e57a-46cb-a71a-de7349961bbd-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.608887 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.608864 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d2396f2-e57a-46cb-a71a-de7349961bbd-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.613585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.613563 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7854\" (UniqueName: \"kubernetes.io/projected/7d2396f2-e57a-46cb-a71a-de7349961bbd-kube-api-access-x7854\") pod \"kuadrant-console-plugin-6cb54b5c86-klfkd\" (UID: \"7d2396f2-e57a-46cb-a71a-de7349961bbd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.659727 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.659687 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" Apr 16 23:34:18.811900 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.811874 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd"] Apr 16 23:34:18.813448 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:34:18.813416 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d2396f2_e57a_46cb_a71a_de7349961bbd.slice/crio-1f95fbf6e5bfd94c3873ff5caecfd57007d50b13c2235e486cd96b9e94fbc89c WatchSource:0}: Error finding container 1f95fbf6e5bfd94c3873ff5caecfd57007d50b13c2235e486cd96b9e94fbc89c: Status 404 returned error can't find the container with id 1f95fbf6e5bfd94c3873ff5caecfd57007d50b13c2235e486cd96b9e94fbc89c Apr 16 23:34:18.821858 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:18.821825 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" event={"ID":"7d2396f2-e57a-46cb-a71a-de7349961bbd","Type":"ContainerStarted","Data":"1f95fbf6e5bfd94c3873ff5caecfd57007d50b13c2235e486cd96b9e94fbc89c"} Apr 16 23:34:36.905603 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:36.905551 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9c56f4866-cjq25" podUID="a868195b-60e2-4cfe-9cbf-ff92b5125a73" containerName="console" containerID="cri-o://b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c" gracePeriod=15 Apr 16 23:34:36.911088 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:36.910517 2564 patch_prober.go:28] interesting pod/console-9c56f4866-cjq25 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.23:8443/health\": dial tcp 10.133.0.23:8443: connect: connection refused" start-of-body= Apr 16 23:34:36.911088 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:36.910599 
2564 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-9c56f4866-cjq25" podUID="a868195b-60e2-4cfe-9cbf-ff92b5125a73" containerName="console" probeResult="failure" output="Get \"https://10.133.0.23:8443/health\": dial tcp 10.133.0.23:8443: connect: connection refused" Apr 16 23:34:41.833850 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.833829 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9c56f4866-cjq25_a868195b-60e2-4cfe-9cbf-ff92b5125a73/console/0.log" Apr 16 23:34:41.834126 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.833894 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c56f4866-cjq25" Apr 16 23:34:41.918857 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.918826 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-service-ca\") pod \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " Apr 16 23:34:41.919033 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.918878 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-serving-cert\") pod \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " Apr 16 23:34:41.919033 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.918926 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-trusted-ca-bundle\") pod \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " Apr 16 23:34:41.919033 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.918965 2564 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-config\") pod \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " Apr 16 23:34:41.919033 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.918997 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjxnh\" (UniqueName: \"kubernetes.io/projected/a868195b-60e2-4cfe-9cbf-ff92b5125a73-kube-api-access-bjxnh\") pod \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " Apr 16 23:34:41.919246 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.919091 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-oauth-config\") pod \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " Apr 16 23:34:41.919304 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.919250 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-service-ca" (OuterVolumeSpecName: "service-ca") pod "a868195b-60e2-4cfe-9cbf-ff92b5125a73" (UID: "a868195b-60e2-4cfe-9cbf-ff92b5125a73"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:34:41.919372 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.919345 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-config" (OuterVolumeSpecName: "console-config") pod "a868195b-60e2-4cfe-9cbf-ff92b5125a73" (UID: "a868195b-60e2-4cfe-9cbf-ff92b5125a73"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:34:41.919433 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.919386 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a868195b-60e2-4cfe-9cbf-ff92b5125a73" (UID: "a868195b-60e2-4cfe-9cbf-ff92b5125a73"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:34:41.919562 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.919542 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-oauth-serving-cert\") pod \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\" (UID: \"a868195b-60e2-4cfe-9cbf-ff92b5125a73\") " Apr 16 23:34:41.919859 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.919735 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a868195b-60e2-4cfe-9cbf-ff92b5125a73" (UID: "a868195b-60e2-4cfe-9cbf-ff92b5125a73"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:34:41.920114 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.920096 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-oauth-serving-cert\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:34:41.920199 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.920119 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-service-ca\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:34:41.920199 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.920134 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-trusted-ca-bundle\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:34:41.920199 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.920147 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-config\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:34:41.921309 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.921286 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a868195b-60e2-4cfe-9cbf-ff92b5125a73" (UID: "a868195b-60e2-4cfe-9cbf-ff92b5125a73"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:34:41.921309 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.921298 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a868195b-60e2-4cfe-9cbf-ff92b5125a73-kube-api-access-bjxnh" (OuterVolumeSpecName: "kube-api-access-bjxnh") pod "a868195b-60e2-4cfe-9cbf-ff92b5125a73" (UID: "a868195b-60e2-4cfe-9cbf-ff92b5125a73"). InnerVolumeSpecName "kube-api-access-bjxnh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:34:41.921551 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.921515 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a868195b-60e2-4cfe-9cbf-ff92b5125a73" (UID: "a868195b-60e2-4cfe-9cbf-ff92b5125a73"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:34:41.922763 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.922747 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9c56f4866-cjq25_a868195b-60e2-4cfe-9cbf-ff92b5125a73/console/0.log" Apr 16 23:34:41.922843 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.922785 2564 generic.go:358] "Generic (PLEG): container finished" podID="a868195b-60e2-4cfe-9cbf-ff92b5125a73" containerID="b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c" exitCode=2 Apr 16 23:34:41.922895 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.922855 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9c56f4866-cjq25" Apr 16 23:34:41.922895 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.922866 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c56f4866-cjq25" event={"ID":"a868195b-60e2-4cfe-9cbf-ff92b5125a73","Type":"ContainerDied","Data":"b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c"} Apr 16 23:34:41.922993 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.922900 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c56f4866-cjq25" event={"ID":"a868195b-60e2-4cfe-9cbf-ff92b5125a73","Type":"ContainerDied","Data":"0f55a17083b258d0999901a5cd119384ef50823935f7c830f2a6cc72f9be402a"} Apr 16 23:34:41.922993 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.922916 2564 scope.go:117] "RemoveContainer" containerID="b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c" Apr 16 23:34:41.924436 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.924404 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" event={"ID":"7d2396f2-e57a-46cb-a71a-de7349961bbd","Type":"ContainerStarted","Data":"b2575e3bd3234d7593c49b2df652bc1bdddb5bae8a6c90f08d9b0a89299f3cee"} Apr 16 23:34:41.931472 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.931452 2564 scope.go:117] "RemoveContainer" containerID="b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c" Apr 16 23:34:41.931740 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:34:41.931719 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c\": container with ID starting with b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c not found: ID does not exist" containerID="b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c" Apr 16 23:34:41.931788 
ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.931749 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c"} err="failed to get container status \"b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c\": rpc error: code = NotFound desc = could not find container \"b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c\": container with ID starting with b2031889a7407343be2ba9ff8dce02d7123f1360b0f32afc414896eb1021812c not found: ID does not exist" Apr 16 23:34:41.941460 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.940942 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfkd" podStartSLOduration=0.995387864 podStartE2EDuration="23.940908622s" podCreationTimestamp="2026-04-16 23:34:18 +0000 UTC" firstStartedPulling="2026-04-16 23:34:18.814715999 +0000 UTC m=+499.430747458" lastFinishedPulling="2026-04-16 23:34:41.760236741 +0000 UTC m=+522.376268216" observedRunningTime="2026-04-16 23:34:41.94043628 +0000 UTC m=+522.556467759" watchObservedRunningTime="2026-04-16 23:34:41.940908622 +0000 UTC m=+522.556940102" Apr 16 23:34:41.958839 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.958813 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9c56f4866-cjq25"] Apr 16 23:34:41.963816 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.963792 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9c56f4866-cjq25"] Apr 16 23:34:41.983112 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:41.983089 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a868195b-60e2-4cfe-9cbf-ff92b5125a73" path="/var/lib/kubelet/pods/a868195b-60e2-4cfe-9cbf-ff92b5125a73/volumes" Apr 16 23:34:42.021568 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:42.021541 2564 reconciler_common.go:299] 
"Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-serving-cert\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:34:42.021568 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:42.021564 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bjxnh\" (UniqueName: \"kubernetes.io/projected/a868195b-60e2-4cfe-9cbf-ff92b5125a73-kube-api-access-bjxnh\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:34:42.021801 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:34:42.021575 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a868195b-60e2-4cfe-9cbf-ff92b5125a73-console-oauth-config\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:35:06.460231 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.460192 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:35:06.460633 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.460540 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a868195b-60e2-4cfe-9cbf-ff92b5125a73" containerName="console" Apr 16 23:35:06.460633 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.460551 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a868195b-60e2-4cfe-9cbf-ff92b5125a73" containerName="console" Apr 16 23:35:06.460633 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.460615 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a868195b-60e2-4cfe-9cbf-ff92b5125a73" containerName="console" Apr 16 23:35:06.489238 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.489202 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:35:06.489399 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.489326 2564 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:06.490151 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.490129 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:35:06.491736 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.491715 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 23:35:06.643341 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.643301 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e60c229e-2b91-44f9-990e-ed9f6138ff01-config-file\") pod \"limitador-limitador-78c99df468-r948t\" (UID: \"e60c229e-2b91-44f9-990e-ed9f6138ff01\") " pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:06.643529 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.643358 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfbk9\" (UniqueName: \"kubernetes.io/projected/e60c229e-2b91-44f9-990e-ed9f6138ff01-kube-api-access-pfbk9\") pod \"limitador-limitador-78c99df468-r948t\" (UID: \"e60c229e-2b91-44f9-990e-ed9f6138ff01\") " pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:06.744226 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.744115 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e60c229e-2b91-44f9-990e-ed9f6138ff01-config-file\") pod \"limitador-limitador-78c99df468-r948t\" (UID: \"e60c229e-2b91-44f9-990e-ed9f6138ff01\") " pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:06.744226 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.744186 2564 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-pfbk9\" (UniqueName: \"kubernetes.io/projected/e60c229e-2b91-44f9-990e-ed9f6138ff01-kube-api-access-pfbk9\") pod \"limitador-limitador-78c99df468-r948t\" (UID: \"e60c229e-2b91-44f9-990e-ed9f6138ff01\") " pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:06.744777 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.744755 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e60c229e-2b91-44f9-990e-ed9f6138ff01-config-file\") pod \"limitador-limitador-78c99df468-r948t\" (UID: \"e60c229e-2b91-44f9-990e-ed9f6138ff01\") " pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:06.753591 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.753568 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfbk9\" (UniqueName: \"kubernetes.io/projected/e60c229e-2b91-44f9-990e-ed9f6138ff01-kube-api-access-pfbk9\") pod \"limitador-limitador-78c99df468-r948t\" (UID: \"e60c229e-2b91-44f9-990e-ed9f6138ff01\") " pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:06.800476 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.800441 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:06.904775 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.904739 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qsdlq"] Apr 16 23:35:06.910081 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.910051 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" Apr 16 23:35:06.912525 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.912504 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lc5nq\"" Apr 16 23:35:06.916946 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.916915 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qsdlq"] Apr 16 23:35:06.938312 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.938274 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:35:06.939854 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:35:06.939825 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode60c229e_2b91_44f9_990e_ed9f6138ff01.slice/crio-f4f15bd7be13a4f5674696b2eefd9b9fcd85e981fe1dd455fd4d32424938d24c WatchSource:0}: Error finding container f4f15bd7be13a4f5674696b2eefd9b9fcd85e981fe1dd455fd4d32424938d24c: Status 404 returned error can't find the container with id f4f15bd7be13a4f5674696b2eefd9b9fcd85e981fe1dd455fd4d32424938d24c Apr 16 23:35:06.946874 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:06.946796 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhvcr\" (UniqueName: \"kubernetes.io/projected/60512e10-dfd6-4024-a2e0-19a20113fb0b-kube-api-access-bhvcr\") pod \"authorino-f99f4b5cd-qsdlq\" (UID: \"60512e10-dfd6-4024-a2e0-19a20113fb0b\") " pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" Apr 16 23:35:07.022324 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:07.022240 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-r948t" 
event={"ID":"e60c229e-2b91-44f9-990e-ed9f6138ff01","Type":"ContainerStarted","Data":"f4f15bd7be13a4f5674696b2eefd9b9fcd85e981fe1dd455fd4d32424938d24c"} Apr 16 23:35:07.047752 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:07.047719 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhvcr\" (UniqueName: \"kubernetes.io/projected/60512e10-dfd6-4024-a2e0-19a20113fb0b-kube-api-access-bhvcr\") pod \"authorino-f99f4b5cd-qsdlq\" (UID: \"60512e10-dfd6-4024-a2e0-19a20113fb0b\") " pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" Apr 16 23:35:07.057089 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:07.057056 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhvcr\" (UniqueName: \"kubernetes.io/projected/60512e10-dfd6-4024-a2e0-19a20113fb0b-kube-api-access-bhvcr\") pod \"authorino-f99f4b5cd-qsdlq\" (UID: \"60512e10-dfd6-4024-a2e0-19a20113fb0b\") " pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" Apr 16 23:35:07.221585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:07.221548 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" Apr 16 23:35:07.348981 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:07.348954 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qsdlq"] Apr 16 23:35:07.350842 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:35:07.350808 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60512e10_dfd6_4024_a2e0_19a20113fb0b.slice/crio-0d2381a9fa023a74fdba248bf66b520c59e321aaa15497646ed76fcf367ea0bf WatchSource:0}: Error finding container 0d2381a9fa023a74fdba248bf66b520c59e321aaa15497646ed76fcf367ea0bf: Status 404 returned error can't find the container with id 0d2381a9fa023a74fdba248bf66b520c59e321aaa15497646ed76fcf367ea0bf Apr 16 23:35:08.028820 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:08.028780 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" event={"ID":"60512e10-dfd6-4024-a2e0-19a20113fb0b","Type":"ContainerStarted","Data":"0d2381a9fa023a74fdba248bf66b520c59e321aaa15497646ed76fcf367ea0bf"} Apr 16 23:35:11.042618 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:11.042573 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" event={"ID":"60512e10-dfd6-4024-a2e0-19a20113fb0b","Type":"ContainerStarted","Data":"50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7"} Apr 16 23:35:11.043998 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:11.043970 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-r948t" event={"ID":"e60c229e-2b91-44f9-990e-ed9f6138ff01","Type":"ContainerStarted","Data":"9fdb9bac94b0a4286ee9ae424cda142f9c0d6976aea6382437751c4fe784daad"} Apr 16 23:35:11.044132 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:11.044109 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:11.057737 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:11.057674 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" podStartSLOduration=1.9415630510000002 podStartE2EDuration="5.057660401s" podCreationTimestamp="2026-04-16 23:35:06 +0000 UTC" firstStartedPulling="2026-04-16 23:35:07.352065018 +0000 UTC m=+547.968096475" lastFinishedPulling="2026-04-16 23:35:10.468162354 +0000 UTC m=+551.084193825" observedRunningTime="2026-04-16 23:35:11.056073988 +0000 UTC m=+551.672105468" watchObservedRunningTime="2026-04-16 23:35:11.057660401 +0000 UTC m=+551.673691880" Apr 16 23:35:11.070491 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:11.070425 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-r948t" podStartSLOduration=1.488354982 podStartE2EDuration="5.070408477s" podCreationTimestamp="2026-04-16 23:35:06 +0000 UTC" firstStartedPulling="2026-04-16 23:35:06.941804438 +0000 UTC m=+547.557835897" lastFinishedPulling="2026-04-16 23:35:10.523857919 +0000 UTC m=+551.139889392" observedRunningTime="2026-04-16 23:35:11.070100956 +0000 UTC m=+551.686132445" watchObservedRunningTime="2026-04-16 23:35:11.070408477 +0000 UTC m=+551.686439957" Apr 16 23:35:11.686749 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:11.686143 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qsdlq"] Apr 16 23:35:13.052272 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:13.052228 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" podUID="60512e10-dfd6-4024-a2e0-19a20113fb0b" containerName="authorino" containerID="cri-o://50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7" gracePeriod=30 Apr 16 23:35:13.291290 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:35:13.291265 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" Apr 16 23:35:13.404264 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:13.404162 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhvcr\" (UniqueName: \"kubernetes.io/projected/60512e10-dfd6-4024-a2e0-19a20113fb0b-kube-api-access-bhvcr\") pod \"60512e10-dfd6-4024-a2e0-19a20113fb0b\" (UID: \"60512e10-dfd6-4024-a2e0-19a20113fb0b\") " Apr 16 23:35:13.406259 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:13.406231 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60512e10-dfd6-4024-a2e0-19a20113fb0b-kube-api-access-bhvcr" (OuterVolumeSpecName: "kube-api-access-bhvcr") pod "60512e10-dfd6-4024-a2e0-19a20113fb0b" (UID: "60512e10-dfd6-4024-a2e0-19a20113fb0b"). InnerVolumeSpecName "kube-api-access-bhvcr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:35:13.505000 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:13.504957 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhvcr\" (UniqueName: \"kubernetes.io/projected/60512e10-dfd6-4024-a2e0-19a20113fb0b-kube-api-access-bhvcr\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:35:14.057654 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.057619 2564 generic.go:358] "Generic (PLEG): container finished" podID="60512e10-dfd6-4024-a2e0-19a20113fb0b" containerID="50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7" exitCode=0 Apr 16 23:35:14.058095 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.057670 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" Apr 16 23:35:14.058095 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.057717 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" event={"ID":"60512e10-dfd6-4024-a2e0-19a20113fb0b","Type":"ContainerDied","Data":"50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7"} Apr 16 23:35:14.058095 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.057759 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qsdlq" event={"ID":"60512e10-dfd6-4024-a2e0-19a20113fb0b","Type":"ContainerDied","Data":"0d2381a9fa023a74fdba248bf66b520c59e321aaa15497646ed76fcf367ea0bf"} Apr 16 23:35:14.058095 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.057778 2564 scope.go:117] "RemoveContainer" containerID="50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7" Apr 16 23:35:14.066427 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.066410 2564 scope.go:117] "RemoveContainer" containerID="50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7" Apr 16 23:35:14.066812 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:35:14.066788 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7\": container with ID starting with 50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7 not found: ID does not exist" containerID="50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7" Apr 16 23:35:14.066912 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.066817 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7"} err="failed to get container status \"50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7\": rpc error: code = NotFound 
desc = could not find container \"50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7\": container with ID starting with 50094f9710705c730241844a24a4380e6b877a3d3b22856b814f54194ff50bc7 not found: ID does not exist" Apr 16 23:35:14.073073 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.073047 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qsdlq"] Apr 16 23:35:14.075241 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:14.075218 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qsdlq"] Apr 16 23:35:15.983060 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:15.983029 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60512e10-dfd6-4024-a2e0-19a20113fb0b" path="/var/lib/kubelet/pods/60512e10-dfd6-4024-a2e0-19a20113fb0b/volumes" Apr 16 23:35:22.049211 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:22.049175 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-r948t" Apr 16 23:35:40.780812 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:40.780771 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ch9t2"] Apr 16 23:35:40.781192 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:40.781121 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60512e10-dfd6-4024-a2e0-19a20113fb0b" containerName="authorino" Apr 16 23:35:40.781192 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:40.781132 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="60512e10-dfd6-4024-a2e0-19a20113fb0b" containerName="authorino" Apr 16 23:35:40.781192 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:40.781190 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="60512e10-dfd6-4024-a2e0-19a20113fb0b" containerName="authorino" Apr 16 23:35:40.784172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:40.784155 2564 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ch9t2" Apr 16 23:35:40.787481 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:40.787458 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lc5nq\"" Apr 16 23:35:40.799570 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:40.799521 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ch9t2"] Apr 16 23:35:40.936854 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:40.936820 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r8hp\" (UniqueName: \"kubernetes.io/projected/e1239aa4-4544-4ce5-91c1-cc5058d174bd-kube-api-access-2r8hp\") pod \"authorino-8b475cf9f-ch9t2\" (UID: \"e1239aa4-4544-4ce5-91c1-cc5058d174bd\") " pod="kuadrant-system/authorino-8b475cf9f-ch9t2" Apr 16 23:35:41.004585 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.004552 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:35:41.038373 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.038289 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r8hp\" (UniqueName: \"kubernetes.io/projected/e1239aa4-4544-4ce5-91c1-cc5058d174bd-kube-api-access-2r8hp\") pod \"authorino-8b475cf9f-ch9t2\" (UID: \"e1239aa4-4544-4ce5-91c1-cc5058d174bd\") " pod="kuadrant-system/authorino-8b475cf9f-ch9t2" Apr 16 23:35:41.056592 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.056561 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ch9t2"] Apr 16 23:35:41.056836 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:35:41.056815 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2r8hp], unattached volumes=[], failed to process 
volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-ch9t2" podUID="e1239aa4-4544-4ce5-91c1-cc5058d174bd" Apr 16 23:35:41.062099 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.062074 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r8hp\" (UniqueName: \"kubernetes.io/projected/e1239aa4-4544-4ce5-91c1-cc5058d174bd-kube-api-access-2r8hp\") pod \"authorino-8b475cf9f-ch9t2\" (UID: \"e1239aa4-4544-4ce5-91c1-cc5058d174bd\") " pod="kuadrant-system/authorino-8b475cf9f-ch9t2" Apr 16 23:35:41.160017 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.159984 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ch9t2" Apr 16 23:35:41.165089 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.165064 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ch9t2" Apr 16 23:35:41.239380 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.239351 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r8hp\" (UniqueName: \"kubernetes.io/projected/e1239aa4-4544-4ce5-91c1-cc5058d174bd-kube-api-access-2r8hp\") pod \"e1239aa4-4544-4ce5-91c1-cc5058d174bd\" (UID: \"e1239aa4-4544-4ce5-91c1-cc5058d174bd\") " Apr 16 23:35:41.241515 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.241483 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1239aa4-4544-4ce5-91c1-cc5058d174bd-kube-api-access-2r8hp" (OuterVolumeSpecName: "kube-api-access-2r8hp") pod "e1239aa4-4544-4ce5-91c1-cc5058d174bd" (UID: "e1239aa4-4544-4ce5-91c1-cc5058d174bd"). InnerVolumeSpecName "kube-api-access-2r8hp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:35:41.340467 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.340380 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2r8hp\" (UniqueName: \"kubernetes.io/projected/e1239aa4-4544-4ce5-91c1-cc5058d174bd-kube-api-access-2r8hp\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:35:41.353340 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.353310 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-bdcbc7554-hbjz2"] Apr 16 23:35:41.356893 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.356877 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:35:41.359559 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.359541 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 23:35:41.379985 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.379954 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-bdcbc7554-hbjz2"] Apr 16 23:35:41.441793 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.441762 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-tls-cert\") pod \"authorino-bdcbc7554-hbjz2\" (UID: \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\") " pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:35:41.441793 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.441800 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxx2s\" (UniqueName: \"kubernetes.io/projected/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-kube-api-access-mxx2s\") pod \"authorino-bdcbc7554-hbjz2\" (UID: \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\") " 
pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:35:41.542371 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.542331 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxx2s\" (UniqueName: \"kubernetes.io/projected/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-kube-api-access-mxx2s\") pod \"authorino-bdcbc7554-hbjz2\" (UID: \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\") " pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:35:41.542530 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.542454 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-tls-cert\") pod \"authorino-bdcbc7554-hbjz2\" (UID: \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\") " pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:35:41.545835 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.545809 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-tls-cert\") pod \"authorino-bdcbc7554-hbjz2\" (UID: \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\") " pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:35:41.551366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.551342 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxx2s\" (UniqueName: \"kubernetes.io/projected/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-kube-api-access-mxx2s\") pod \"authorino-bdcbc7554-hbjz2\" (UID: \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\") " pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:35:41.669498 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.669415 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:35:41.808367 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:41.808342 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-bdcbc7554-hbjz2"] Apr 16 23:35:41.809801 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:35:41.809768 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ef33c98_0f3c_4967_af32_4ecf66eab6e4.slice/crio-5fb92da59e89dbef89dfc988a2da7861e2b7a0bbfc168f5f8dd4d021a50e7239 WatchSource:0}: Error finding container 5fb92da59e89dbef89dfc988a2da7861e2b7a0bbfc168f5f8dd4d021a50e7239: Status 404 returned error can't find the container with id 5fb92da59e89dbef89dfc988a2da7861e2b7a0bbfc168f5f8dd4d021a50e7239 Apr 16 23:35:42.169687 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:42.169649 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ch9t2" Apr 16 23:35:42.169869 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:42.169656 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" event={"ID":"0ef33c98-0f3c-4967-af32-4ecf66eab6e4","Type":"ContainerStarted","Data":"5fb92da59e89dbef89dfc988a2da7861e2b7a0bbfc168f5f8dd4d021a50e7239"} Apr 16 23:35:42.215961 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:42.215911 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ch9t2"] Apr 16 23:35:42.220091 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:42.220064 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ch9t2"] Apr 16 23:35:43.174849 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:43.174810 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" 
event={"ID":"0ef33c98-0f3c-4967-af32-4ecf66eab6e4","Type":"ContainerStarted","Data":"3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b"} Apr 16 23:35:43.196503 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:43.196454 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" podStartSLOduration=1.811654507 podStartE2EDuration="2.196438006s" podCreationTimestamp="2026-04-16 23:35:41 +0000 UTC" firstStartedPulling="2026-04-16 23:35:41.81116888 +0000 UTC m=+582.427200338" lastFinishedPulling="2026-04-16 23:35:42.195952365 +0000 UTC m=+582.811983837" observedRunningTime="2026-04-16 23:35:43.193946176 +0000 UTC m=+583.809977656" watchObservedRunningTime="2026-04-16 23:35:43.196438006 +0000 UTC m=+583.812469524" Apr 16 23:35:43.982462 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:43.982428 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1239aa4-4544-4ce5-91c1-cc5058d174bd" path="/var/lib/kubelet/pods/e1239aa4-4544-4ce5-91c1-cc5058d174bd/volumes" Apr 16 23:35:59.922110 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:59.922081 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:35:59.922650 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:59.922456 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:35:59.925278 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:59.925256 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log" Apr 16 23:35:59.925911 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:35:59.925890 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log" Apr 16 23:36:45.292560 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:36:45.292519 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:36:47.794523 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:36:47.794484 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:37:06.395664 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:37:06.395627 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:37:16.190764 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:37:16.190720 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:37:42.584466 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:37:42.584385 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:37:56.289477 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:37:56.289438 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:38:28.893728 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:28.893679 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5565f6cccf-qwgk4"] Apr 16 23:38:28.896378 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:28.896360 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5565f6cccf-qwgk4" Apr 16 23:38:28.903681 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:28.903651 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5565f6cccf-qwgk4"] Apr 16 23:38:28.984887 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:28.984844 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a09c1586-0cb9-49ea-9f41-2b4831e94d6f-tls-cert\") pod \"authorino-5565f6cccf-qwgk4\" (UID: \"a09c1586-0cb9-49ea-9f41-2b4831e94d6f\") " pod="kuadrant-system/authorino-5565f6cccf-qwgk4" Apr 16 23:38:28.985069 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:28.984918 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgjpt\" (UniqueName: \"kubernetes.io/projected/a09c1586-0cb9-49ea-9f41-2b4831e94d6f-kube-api-access-vgjpt\") pod \"authorino-5565f6cccf-qwgk4\" (UID: \"a09c1586-0cb9-49ea-9f41-2b4831e94d6f\") " pod="kuadrant-system/authorino-5565f6cccf-qwgk4" Apr 16 23:38:29.085500 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:29.085466 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a09c1586-0cb9-49ea-9f41-2b4831e94d6f-tls-cert\") pod \"authorino-5565f6cccf-qwgk4\" (UID: \"a09c1586-0cb9-49ea-9f41-2b4831e94d6f\") " pod="kuadrant-system/authorino-5565f6cccf-qwgk4" Apr 16 23:38:29.085679 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:29.085562 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgjpt\" (UniqueName: \"kubernetes.io/projected/a09c1586-0cb9-49ea-9f41-2b4831e94d6f-kube-api-access-vgjpt\") pod \"authorino-5565f6cccf-qwgk4\" (UID: \"a09c1586-0cb9-49ea-9f41-2b4831e94d6f\") " pod="kuadrant-system/authorino-5565f6cccf-qwgk4" Apr 16 23:38:29.088045 ip-10-0-131-43 kubenswrapper[2564]: I0416 
23:38:29.088017 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a09c1586-0cb9-49ea-9f41-2b4831e94d6f-tls-cert\") pod \"authorino-5565f6cccf-qwgk4\" (UID: \"a09c1586-0cb9-49ea-9f41-2b4831e94d6f\") " pod="kuadrant-system/authorino-5565f6cccf-qwgk4" Apr 16 23:38:29.092863 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:29.092839 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgjpt\" (UniqueName: \"kubernetes.io/projected/a09c1586-0cb9-49ea-9f41-2b4831e94d6f-kube-api-access-vgjpt\") pod \"authorino-5565f6cccf-qwgk4\" (UID: \"a09c1586-0cb9-49ea-9f41-2b4831e94d6f\") " pod="kuadrant-system/authorino-5565f6cccf-qwgk4" Apr 16 23:38:29.206768 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:29.206732 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5565f6cccf-qwgk4" Apr 16 23:38:29.333101 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:29.333066 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5565f6cccf-qwgk4"] Apr 16 23:38:29.336624 ip-10-0-131-43 kubenswrapper[2564]: W0416 23:38:29.336583 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda09c1586_0cb9_49ea_9f41_2b4831e94d6f.slice/crio-7846e3178a5593e1bade45360874ee637994fc2c53bd3f1269e823bfa6fc2cf9 WatchSource:0}: Error finding container 7846e3178a5593e1bade45360874ee637994fc2c53bd3f1269e823bfa6fc2cf9: Status 404 returned error can't find the container with id 7846e3178a5593e1bade45360874ee637994fc2c53bd3f1269e823bfa6fc2cf9 Apr 16 23:38:29.337860 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:29.337841 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:38:29.805665 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:29.805629 2564 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kuadrant-system/authorino-5565f6cccf-qwgk4" event={"ID":"a09c1586-0cb9-49ea-9f41-2b4831e94d6f","Type":"ContainerStarted","Data":"7846e3178a5593e1bade45360874ee637994fc2c53bd3f1269e823bfa6fc2cf9"} Apr 16 23:38:30.814743 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:30.814650 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5565f6cccf-qwgk4" event={"ID":"a09c1586-0cb9-49ea-9f41-2b4831e94d6f","Type":"ContainerStarted","Data":"4ca855cf7692b4df138392d684561a7dbb150216299740489ff38b9b71b96fdc"} Apr 16 23:38:30.830409 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:30.830357 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5565f6cccf-qwgk4" podStartSLOduration=2.336553039 podStartE2EDuration="2.830341877s" podCreationTimestamp="2026-04-16 23:38:28 +0000 UTC" firstStartedPulling="2026-04-16 23:38:29.337962447 +0000 UTC m=+749.953993903" lastFinishedPulling="2026-04-16 23:38:29.83175128 +0000 UTC m=+750.447782741" observedRunningTime="2026-04-16 23:38:30.828586053 +0000 UTC m=+751.444617533" watchObservedRunningTime="2026-04-16 23:38:30.830341877 +0000 UTC m=+751.446373413" Apr 16 23:38:30.859445 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:30.859406 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-bdcbc7554-hbjz2"] Apr 16 23:38:30.859709 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:30.859671 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" podUID="0ef33c98-0f3c-4967-af32-4ecf66eab6e4" containerName="authorino" containerID="cri-o://3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b" gracePeriod=30 Apr 16 23:38:31.106120 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.106097 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:38:31.203923 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.203886 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxx2s\" (UniqueName: \"kubernetes.io/projected/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-kube-api-access-mxx2s\") pod \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\" (UID: \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\") " Apr 16 23:38:31.204095 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.203980 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-tls-cert\") pod \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\" (UID: \"0ef33c98-0f3c-4967-af32-4ecf66eab6e4\") " Apr 16 23:38:31.206085 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.206049 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-kube-api-access-mxx2s" (OuterVolumeSpecName: "kube-api-access-mxx2s") pod "0ef33c98-0f3c-4967-af32-4ecf66eab6e4" (UID: "0ef33c98-0f3c-4967-af32-4ecf66eab6e4"). InnerVolumeSpecName "kube-api-access-mxx2s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:38:31.214204 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.214181 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "0ef33c98-0f3c-4967-af32-4ecf66eab6e4" (UID: "0ef33c98-0f3c-4967-af32-4ecf66eab6e4"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:38:31.304642 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.304607 2564 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-tls-cert\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:38:31.304642 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.304636 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mxx2s\" (UniqueName: \"kubernetes.io/projected/0ef33c98-0f3c-4967-af32-4ecf66eab6e4-kube-api-access-mxx2s\") on node \"ip-10-0-131-43.ec2.internal\" DevicePath \"\"" Apr 16 23:38:31.819534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.819498 2564 generic.go:358] "Generic (PLEG): container finished" podID="0ef33c98-0f3c-4967-af32-4ecf66eab6e4" containerID="3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b" exitCode=0 Apr 16 23:38:31.819970 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.819552 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" Apr 16 23:38:31.819970 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.819586 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" event={"ID":"0ef33c98-0f3c-4967-af32-4ecf66eab6e4","Type":"ContainerDied","Data":"3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b"} Apr 16 23:38:31.819970 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.819631 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-bdcbc7554-hbjz2" event={"ID":"0ef33c98-0f3c-4967-af32-4ecf66eab6e4","Type":"ContainerDied","Data":"5fb92da59e89dbef89dfc988a2da7861e2b7a0bbfc168f5f8dd4d021a50e7239"} Apr 16 23:38:31.819970 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.819647 2564 scope.go:117] "RemoveContainer" containerID="3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b" Apr 16 23:38:31.829534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.829517 2564 scope.go:117] "RemoveContainer" containerID="3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b" Apr 16 23:38:31.829818 ip-10-0-131-43 kubenswrapper[2564]: E0416 23:38:31.829794 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b\": container with ID starting with 3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b not found: ID does not exist" containerID="3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b" Apr 16 23:38:31.829871 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.829828 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b"} err="failed to get container status \"3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b\": rpc error: code = NotFound 
desc = could not find container \"3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b\": container with ID starting with 3ba4ce903cbdd1b994e21604c3c5cfa4633d320c5c91989821e8ee35dad9227b not found: ID does not exist" Apr 16 23:38:31.839829 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.839789 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-bdcbc7554-hbjz2"] Apr 16 23:38:31.842008 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.841983 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-bdcbc7554-hbjz2"] Apr 16 23:38:31.983309 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:31.983276 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ef33c98-0f3c-4967-af32-4ecf66eab6e4" path="/var/lib/kubelet/pods/0ef33c98-0f3c-4967-af32-4ecf66eab6e4/volumes" Apr 16 23:38:56.084192 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:38:56.084147 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:39:06.777579 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:39:06.777528 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:39:15.987433 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:39:15.987354 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:39:26.087398 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:39:26.087364 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:39:34.585510 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:39:34.585461 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:39:44.783379 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:39:44.783341 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:40:47.289963 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:40:47.289861 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:40:59.954341 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:40:59.954315 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:40:59.956466 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:40:59.956445 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:40:59.957155 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:40:59.957135 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log" Apr 16 23:40:59.959607 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:40:59.959587 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log" Apr 16 23:41:02.791412 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:41:02.791372 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:41:40.891712 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:41:40.891662 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:41:57.785302 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:41:57.785265 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:42:11.885181 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:42:11.885079 2564 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:42:28.783948 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:42:28.783912 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:42:55.589595 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:42:55.589556 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:43:00.282289 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:43:00.282244 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:43:22.483517 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:43:22.483478 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:43:30.588183 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:43:30.588146 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:43:47.783268 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:43:47.783184 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:43:56.481408 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:43:56.481365 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:44:13.386734 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:44:13.386684 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:44:21.192662 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:44:21.192618 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:44:54.682951 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:44:54.682913 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:45:03.581365 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:03.581329 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:45:10.882406 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:10.882318 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:45:19.391015 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:19.390978 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:45:28.294089 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:28.294054 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:45:44.712879 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:44.712839 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:45:57.875357 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:57.875315 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"] Apr 16 23:45:59.989754 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:59.989723 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:45:59.992086 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:59.992067 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log" Apr 16 23:45:59.992764 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:59.992745 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:45:59.994948 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:45:59.994930 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:46:44.792471 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:46:44.792437 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:46:52.487074 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:46:52.487032 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:47:01.410574 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:47:01.410531 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:47:10.629192 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:47:10.629152 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:47:19.684791 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:47:19.684757 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:47:27.795770 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:47:27.795735 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:47:36.988534 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:47:36.988489 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:47:44.799736 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:47:44.799682 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:47:53.795532 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:47:53.795497 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:48:02.282950 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:48:02.282909 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:48:11.589144 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:48:11.589052 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:48:19.781620 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:48:19.781585 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:48:29.488194 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:48:29.488148 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:48:37.389923 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:48:37.389882 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:48:46.488437 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:48:46.488396 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:48:54.690088 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:48:54.690040 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:49:03.592811 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:49:03.592765 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:49:12.189880 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:49:12.189838 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:51:00.020185 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:51:00.020155 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log"
Apr 16 23:51:00.023130 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:51:00.023107 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:51:00.024172 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:51:00.024136 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log"
Apr 16 23:51:00.026977 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:51:00.026957 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:51:28.288412 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:51:28.288374 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:51:35.585774 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:51:35.585728 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:52:00.388339 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:52:00.388299 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:52:07.180676 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:52:07.180637 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:52:16.591980 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:52:16.591938 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:52:27.587454 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:52:27.587410 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:52:35.585655 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:52:35.585611 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:52:46.582086 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:52:46.581998 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:52:55.285105 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:52:55.285060 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:53:05.880116 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:53:05.880078 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:53:14.083584 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:53:14.083547 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:53:25.285680 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:53:25.285643 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:53:34.498006 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:53:34.497956 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:54:06.685366 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:54:06.685330 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:54:49.914276 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:54:49.914229 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:54:58.093579 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:54:58.093531 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:55:06.589594 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:55:06.589554 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:55:15.385498 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:55:15.385457 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:55:24.683570 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:55:24.683533 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:55:35.077482 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:55:35.077440 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:55:42.693199 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:55:42.693111 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:55:51.390539 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:55:51.390502 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:55:59.087016 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:55:59.086977 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:56:00.056321 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:00.056285 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log"
Apr 16 23:56:00.059494 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:00.059469 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:56:00.064939 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:00.064911 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log"
Apr 16 23:56:00.068195 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:00.068173 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 16 23:56:07.586673 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:07.586621 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:56:15.983503 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:15.983465 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:56:26.989150 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:26.989105 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:56:43.793608 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:43.793564 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:56:53.287720 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:56:53.287664 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:57:00.891368 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:57:00.891331 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:57:09.983621 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:57:09.983548 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:57:26.884042 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:57:26.884004 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:57:34.686429 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:57:34.686391 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:57:43.987566 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:57:43.987535 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:57:51.888812 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:57:51.888771 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:58:01.088207 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:58:01.088168 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:58:09.281249 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:58:09.281207 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:58:20.077800 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:58:20.077766 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:58:30.585996 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:58:30.585955 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:58:39.988893 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:58:39.988806 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:58:50.878430 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:58:50.878392 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:58:58.591692 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:58:58.591650 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:59:06.085282 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:59:06.085247 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:59:15.385920 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:59:15.385877 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:59:23.286910 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:59:23.286873 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:59:39.681388 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:59:39.681350 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:59:47.682244 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:59:47.682210 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 16 23:59:56.683048 ip-10-0-131-43 kubenswrapper[2564]: I0416 23:59:56.683014 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 17 00:00:04.790455 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:04.790421 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 17 00:00:29.783053 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:29.783013 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 17 00:00:42.086399 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:42.086360 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-r948t"]
Apr 17 00:00:43.652894 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:43.652833 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5565f6cccf-qwgk4_a09c1586-0cb9-49ea-9f41-2b4831e94d6f/authorino/0.log"
Apr 17 00:00:47.922998 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:47.922963 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8bf69b96d-9t79m_c684f7a7-cc33-43fd-993c-56565eeab0ca/manager/0.log"
Apr 17 00:00:48.986331 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:48.986300 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs_ad4bf82c-9cdd-4e1c-805d-88353fa74ad6/util/0.log"
Apr 17 00:00:48.991591 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:48.991567 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs_ad4bf82c-9cdd-4e1c-805d-88353fa74ad6/pull/0.log"
Apr 17 00:00:48.996452 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:48.996437 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs_ad4bf82c-9cdd-4e1c-805d-88353fa74ad6/extract/0.log"
Apr 17 00:00:49.100672 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.100641 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8_797c63f2-40b6-41f4-95ec-c57ed738695a/pull/0.log"
Apr 17 00:00:49.105889 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.105864 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8_797c63f2-40b6-41f4-95ec-c57ed738695a/extract/0.log"
Apr 17 00:00:49.110935 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.110913 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8_797c63f2-40b6-41f4-95ec-c57ed738695a/util/0.log"
Apr 17 00:00:49.213168 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.213140 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss_221ea960-cf1a-46e4-a80b-8a7bd2d9b84a/util/0.log"
Apr 17 00:00:49.218452 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.218431 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss_221ea960-cf1a-46e4-a80b-8a7bd2d9b84a/pull/0.log"
Apr 17 00:00:49.223541 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.223523 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss_221ea960-cf1a-46e4-a80b-8a7bd2d9b84a/extract/0.log"
Apr 17 00:00:49.331578 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.331509 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg_67a674cd-ba01-41f4-877c-55adb3dd845c/util/0.log"
Apr 17 00:00:49.339525 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.339501 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg_67a674cd-ba01-41f4-877c-55adb3dd845c/pull/0.log"
Apr 17 00:00:49.345068 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.345041 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg_67a674cd-ba01-41f4-877c-55adb3dd845c/extract/0.log"
Apr 17 00:00:49.464088 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.464057 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5565f6cccf-qwgk4_a09c1586-0cb9-49ea-9f41-2b4831e94d6f/authorino/0.log"
Apr 17 00:00:49.782726 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:49.782674 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-klfkd_7d2396f2-e57a-46cb-a71a-de7349961bbd/kuadrant-console-plugin/0.log"
Apr 17 00:00:50.108260 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:50.108177 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-r948t_e60c229e-2b91-44f9-990e-ed9f6138ff01/limitador/0.log"
Apr 17 00:00:50.859784 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:50.859756 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-577868f455-sqcr4_4767bdaf-3083-4960-9ed8-e4ad26613b2d/kube-auth-proxy/0.log"
Apr 17 00:00:55.430333 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.430296 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zs59b/must-gather-ccfng"]
Apr 17 00:00:55.430769 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.430667 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ef33c98-0f3c-4967-af32-4ecf66eab6e4" containerName="authorino"
Apr 17 00:00:55.430769 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.430678 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef33c98-0f3c-4967-af32-4ecf66eab6e4" containerName="authorino"
Apr 17 00:00:55.430769 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.430747 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ef33c98-0f3c-4967-af32-4ecf66eab6e4" containerName="authorino"
Apr 17 00:00:55.433754 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.433737 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs59b/must-gather-ccfng"
Apr 17 00:00:55.435990 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.435969 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zs59b\"/\"openshift-service-ca.crt\""
Apr 17 00:00:55.436665 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.436649 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zs59b\"/\"default-dockercfg-rtvn7\""
Apr 17 00:00:55.436769 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.436735 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zs59b\"/\"kube-root-ca.crt\""
Apr 17 00:00:55.441967 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.441947 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs59b/must-gather-ccfng"]
Apr 17 00:00:55.494803 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.494771 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnf69\" (UniqueName: \"kubernetes.io/projected/7daee524-e7dd-4548-9b8e-778b76475376-kube-api-access-tnf69\") pod \"must-gather-ccfng\" (UID: \"7daee524-e7dd-4548-9b8e-778b76475376\") " pod="openshift-must-gather-zs59b/must-gather-ccfng"
Apr 17 00:00:55.494972 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.494825 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7daee524-e7dd-4548-9b8e-778b76475376-must-gather-output\") pod \"must-gather-ccfng\" (UID: \"7daee524-e7dd-4548-9b8e-778b76475376\") " pod="openshift-must-gather-zs59b/must-gather-ccfng"
Apr 17 00:00:55.595503 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.595475 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnf69\" (UniqueName: \"kubernetes.io/projected/7daee524-e7dd-4548-9b8e-778b76475376-kube-api-access-tnf69\") pod \"must-gather-ccfng\" (UID: \"7daee524-e7dd-4548-9b8e-778b76475376\") " pod="openshift-must-gather-zs59b/must-gather-ccfng"
Apr 17 00:00:55.595718 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.595521 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7daee524-e7dd-4548-9b8e-778b76475376-must-gather-output\") pod \"must-gather-ccfng\" (UID: \"7daee524-e7dd-4548-9b8e-778b76475376\") " pod="openshift-must-gather-zs59b/must-gather-ccfng"
Apr 17 00:00:55.595857 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.595828 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7daee524-e7dd-4548-9b8e-778b76475376-must-gather-output\") pod \"must-gather-ccfng\" (UID: \"7daee524-e7dd-4548-9b8e-778b76475376\") " pod="openshift-must-gather-zs59b/must-gather-ccfng"
Apr 17 00:00:55.603408 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.603384 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnf69\" (UniqueName: \"kubernetes.io/projected/7daee524-e7dd-4548-9b8e-778b76475376-kube-api-access-tnf69\") pod \"must-gather-ccfng\" (UID: \"7daee524-e7dd-4548-9b8e-778b76475376\") " pod="openshift-must-gather-zs59b/must-gather-ccfng"
Apr 17 00:00:55.743653 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.743614 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs59b/must-gather-ccfng"
Apr 17 00:00:55.866786 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.866757 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs59b/must-gather-ccfng"]
Apr 17 00:00:55.868684 ip-10-0-131-43 kubenswrapper[2564]: W0417 00:00:55.868656 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7daee524_e7dd_4548_9b8e_778b76475376.slice/crio-09bf7493b77884df18f085e5294300477d366fff681efb3baf79ebde32301f66 WatchSource:0}: Error finding container 09bf7493b77884df18f085e5294300477d366fff681efb3baf79ebde32301f66: Status 404 returned error can't find the container with id 09bf7493b77884df18f085e5294300477d366fff681efb3baf79ebde32301f66
Apr 17 00:00:55.870438 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.870419 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 00:00:55.989330 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:55.989293 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/must-gather-ccfng" event={"ID":"7daee524-e7dd-4548-9b8e-778b76475376","Type":"ContainerStarted","Data":"09bf7493b77884df18f085e5294300477d366fff681efb3baf79ebde32301f66"}
Apr 17 00:00:58.000722 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:58.000637 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/must-gather-ccfng" event={"ID":"7daee524-e7dd-4548-9b8e-778b76475376","Type":"ContainerStarted","Data":"648ffedfa217795ba42bec3e9990a3e6955392411f13af275e064e260737b97f"}
Apr 17 00:00:59.007616 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:59.007582 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/must-gather-ccfng" event={"ID":"7daee524-e7dd-4548-9b8e-778b76475376","Type":"ContainerStarted","Data":"60797a0000a8f2be9ffc8a41e6e1cf2dcd9aed269171b0abcd544800d45e40ac"}
Apr 17 00:00:59.026058 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:59.025993 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zs59b/must-gather-ccfng" podStartSLOduration=2.119684579 podStartE2EDuration="4.025973822s" podCreationTimestamp="2026-04-17 00:00:55 +0000 UTC" firstStartedPulling="2026-04-17 00:00:55.870548508 +0000 UTC m=+2096.486579965" lastFinishedPulling="2026-04-17 00:00:57.776837751 +0000 UTC m=+2098.392869208" observedRunningTime="2026-04-17 00:00:59.024958793 +0000 UTC m=+2099.640990275" watchObservedRunningTime="2026-04-17 00:00:59.025973822 +0000 UTC m=+2099.642005302"
Apr 17 00:00:59.429201 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:59.429115 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bhg8n_30de2f08-6583-4f71-b865-e6f57f20268c/global-pull-secret-syncer/0.log"
Apr 17 00:00:59.629056 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:59.629022 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wmsm7_c7fc9336-0583-4750-b4d1-143cb0e8e3bf/konnectivity-agent/0.log"
Apr 17 00:00:59.668930 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:00:59.668897 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-43.ec2.internal_8c6c197c2824d262900a926e3ff6a96c/haproxy/0.log"
Apr 17 00:01:00.118613 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:00.118580 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log"
Apr 17 00:01:00.121180 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:00.121137 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log"
Apr 17 00:01:00.122979 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:00.122768 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 17 00:01:00.125505 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:00.125476 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log"
Apr 17 00:01:03.715292 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.715240 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs_ad4bf82c-9cdd-4e1c-805d-88353fa74ad6/extract/0.log"
Apr 17 00:01:03.735898 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.735869 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs_ad4bf82c-9cdd-4e1c-805d-88353fa74ad6/util/0.log"
Apr 17 00:01:03.756475 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.756441 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tdzvs_ad4bf82c-9cdd-4e1c-805d-88353fa74ad6/pull/0.log"
Apr 17 00:01:03.790181 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.790108 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8_797c63f2-40b6-41f4-95ec-c57ed738695a/extract/0.log"
Apr 17 00:01:03.809793 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.809761 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8_797c63f2-40b6-41f4-95ec-c57ed738695a/util/0.log"
Apr 17 00:01:03.829764 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.829638 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xnkv8_797c63f2-40b6-41f4-95ec-c57ed738695a/pull/0.log"
Apr 17 00:01:03.855328 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.855258 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss_221ea960-cf1a-46e4-a80b-8a7bd2d9b84a/extract/0.log"
Apr 17 00:01:03.879752 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.879682 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss_221ea960-cf1a-46e4-a80b-8a7bd2d9b84a/util/0.log"
Apr 17 00:01:03.901543 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.901515 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73k9pss_221ea960-cf1a-46e4-a80b-8a7bd2d9b84a/pull/0.log"
Apr 17 00:01:03.932655 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.932615 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg_67a674cd-ba01-41f4-877c-55adb3dd845c/extract/0.log"
Apr 17 00:01:03.952724 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.952671 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg_67a674cd-ba01-41f4-877c-55adb3dd845c/util/0.log"
Apr 17 00:01:03.972888 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:03.972805 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef15zkdg_67a674cd-ba01-41f4-877c-55adb3dd845c/pull/0.log"
Apr 17 00:01:04.284488 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:04.284349 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5565f6cccf-qwgk4_a09c1586-0cb9-49ea-9f41-2b4831e94d6f/authorino/0.log"
Apr 17 00:01:04.358633 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:04.358596 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-klfkd_7d2396f2-e57a-46cb-a71a-de7349961bbd/kuadrant-console-plugin/0.log"
Apr 17 00:01:04.475186 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:04.475158 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-r948t_e60c229e-2b91-44f9-990e-ed9f6138ff01/limitador/0.log"
Apr 17 00:01:06.013235 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.013197 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t99s8_5d9ce457-6a1e-4495-8955-15be5d126952/kube-state-metrics/0.log"
Apr 17 00:01:06.033768 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.033740 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t99s8_5d9ce457-6a1e-4495-8955-15be5d126952/kube-rbac-proxy-main/0.log"
Apr 17 00:01:06.053797 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.053768 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t99s8_5d9ce457-6a1e-4495-8955-15be5d126952/kube-rbac-proxy-self/0.log"
Apr 17 00:01:06.082074 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.082023 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6db978cd86-8m9t5_b68074bf-4c69-4374-ad15-d584ea87107b/metrics-server/0.log"
Apr 17 00:01:06.282362 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.282269 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f47hd_52c47db8-79a8-405f-b69b-00344f1bd3b2/node-exporter/0.log"
Apr 17 00:01:06.303514 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.303484 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f47hd_52c47db8-79a8-405f-b69b-00344f1bd3b2/kube-rbac-proxy/0.log"
Apr 17 00:01:06.331588 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.331557 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-f47hd_52c47db8-79a8-405f-b69b-00344f1bd3b2/init-textfile/0.log"
Apr 17 00:01:06.356583 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.356545 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7mr58_9588a074-c48c-4252-990c-1585ba91f39e/kube-rbac-proxy-main/0.log"
Apr 17 00:01:06.375722 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.375674 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7mr58_9588a074-c48c-4252-990c-1585ba91f39e/kube-rbac-proxy-self/0.log"
Apr 17 00:01:06.395201 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.395173 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7mr58_9588a074-c48c-4252-990c-1585ba91f39e/openshift-state-metrics/0.log"
Apr 17 00:01:06.589472 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.589400 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bzvl5_44ca12aa-7211-4889-91a2-95139cba7d7d/prometheus-operator/0.log"
Apr 17 00:01:06.605599 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.605561 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bzvl5_44ca12aa-7211-4889-91a2-95139cba7d7d/kube-rbac-proxy/0.log"
Apr 17 00:01:06.661310 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.661279 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-89fc5cd9-nnkdg_a160bce0-fcf3-4730-bdeb-d185a0828bd4/telemeter-client/0.log"
Apr 17 00:01:06.680612 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.680557 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-89fc5cd9-nnkdg_a160bce0-fcf3-4730-bdeb-d185a0828bd4/reload/0.log"
Apr 17 00:01:06.700629 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:06.700595 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-89fc5cd9-nnkdg_a160bce0-fcf3-4730-bdeb-d185a0828bd4/kube-rbac-proxy/0.log"
Apr 17 00:01:08.300878 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:08.300841 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/2.log"
Apr 17 00:01:08.305347 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:08.305318 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ldlc_ed5b2be5-9e2a-419f-989a-30f08a0e3d57/console-operator/3.log"
Apr 17 00:01:08.738000 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:08.737973 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79d7cf776-sspzd_b200247f-e2ef-48e2-95a2-daad0e34e8f7/console/0.log"
Apr 17 00:01:08.769383 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:08.769350 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lq92s_9d0463f5-8d66-479e-a0dc-a13347a782d2/download-server/0.log"
Apr 17 00:01:08.883713 ip-10-0-131-43 kubenswrapper[2564]: I0417
00:01:08.883663 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc"] Apr 17 00:01:08.889540 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:08.889511 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:08.897533 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:08.897503 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc"] Apr 17 00:01:09.027197 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.026858 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-podres\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.027197 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.026930 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-sys\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.027197 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.026959 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-proc\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.027197 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.026981 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-lib-modules\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.027197 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.027027 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6zls\" (UniqueName: \"kubernetes.io/projected/940886d7-ef4a-43c8-abb3-222756b0e2e9-kube-api-access-w6zls\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128340 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128295 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6zls\" (UniqueName: \"kubernetes.io/projected/940886d7-ef4a-43c8-abb3-222756b0e2e9-kube-api-access-w6zls\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128601 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128397 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-podres\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128601 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128450 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-sys\") pod \"perf-node-gather-daemonset-kknhc\" (UID: 
\"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128601 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128473 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-proc\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128601 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128489 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-lib-modules\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128601 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128573 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-sys\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128952 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128634 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-proc\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128952 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128640 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-lib-modules\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.128952 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.128648 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/940886d7-ef4a-43c8-abb3-222756b0e2e9-podres\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.136978 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.136949 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6zls\" (UniqueName: \"kubernetes.io/projected/940886d7-ef4a-43c8-abb3-222756b0e2e9-kube-api-access-w6zls\") pod \"perf-node-gather-daemonset-kknhc\" (UID: \"940886d7-ef4a-43c8-abb3-222756b0e2e9\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.204766 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.204272 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:09.360738 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:09.360683 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc"] Apr 17 00:01:10.075325 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:10.075283 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" event={"ID":"940886d7-ef4a-43c8-abb3-222756b0e2e9","Type":"ContainerStarted","Data":"8a0d21269938d57e0a64771c0252216dfe21b0daf346b2e8576dae8b1027e44a"} Apr 17 00:01:10.075325 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:10.075331 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" event={"ID":"940886d7-ef4a-43c8-abb3-222756b0e2e9","Type":"ContainerStarted","Data":"bd354a6e812911f63ee752f96147014958167bdbb6054e6b3a3bb012751dbcab"} Apr 17 00:01:10.075727 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:10.075352 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:10.090541 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:10.090480 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" podStartSLOduration=2.090460412 podStartE2EDuration="2.090460412s" podCreationTimestamp="2026-04-17 00:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 00:01:10.088057646 +0000 UTC m=+2110.704089151" watchObservedRunningTime="2026-04-17 00:01:10.090460412 +0000 UTC m=+2110.706491891" Apr 17 00:01:10.154983 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:10.154960 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-fr2wb_9efb52c5-c96d-422a-8c15-e03f71fdd622/dns/0.log" Apr 17 00:01:10.176299 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:10.176276 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fr2wb_9efb52c5-c96d-422a-8c15-e03f71fdd622/kube-rbac-proxy/0.log" Apr 17 00:01:10.196631 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:10.196603 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4kt5z_5aea2741-baa5-486d-8a0f-9eef53a7f27a/dns-node-resolver/0.log" Apr 17 00:01:10.739781 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:10.739746 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dg2kq_fa4aee53-dd23-4d2a-9b51-9a0d0822c22a/node-ca/0.log" Apr 17 00:01:11.692219 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:11.692192 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-577868f455-sqcr4_4767bdaf-3083-4960-9ed8-e4ad26613b2d/kube-auth-proxy/0.log" Apr 17 00:01:12.271385 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:12.271320 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cwjtr_3dc4e703-91ac-44ae-9d1a-83214f2378fd/serve-healthcheck-canary/0.log" Apr 17 00:01:12.731592 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:12.731565 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-69rmt_a5316185-790a-4a40-b230-e6cc6cc0b80b/insights-operator/0.log" Apr 17 00:01:12.733757 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:12.733732 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-69rmt_a5316185-790a-4a40-b230-e6cc6cc0b80b/insights-operator/1.log" Apr 17 00:01:12.821714 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:12.821673 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-n9vr7_03a8050d-e2eb-4c21-91c0-a18c9baefeca/kube-rbac-proxy/0.log" Apr 17 00:01:12.842858 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:12.842830 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n9vr7_03a8050d-e2eb-4c21-91c0-a18c9baefeca/exporter/0.log" Apr 17 00:01:12.864840 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:12.864815 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n9vr7_03a8050d-e2eb-4c21-91c0-a18c9baefeca/extractor/0.log" Apr 17 00:01:14.943689 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:14.943654 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8bf69b96d-9t79m_c684f7a7-cc33-43fd-993c-56565eeab0ca/manager/0.log" Apr 17 00:01:16.085547 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:16.085521 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-86bf875fd5-d558z_ee2cd232-bc5b-489a-a512-2db820b1a069/manager/0.log" Apr 17 00:01:16.091079 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:16.091054 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-kknhc" Apr 17 00:01:20.470647 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:20.470617 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-4ljv8_ea4856cb-ab54-40ff-8382-60807f91deea/migrator/0.log" Apr 17 00:01:20.492949 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:20.492920 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-4ljv8_ea4856cb-ab54-40ff-8382-60807f91deea/graceful-termination/0.log" Apr 17 00:01:20.884110 ip-10-0-131-43 kubenswrapper[2564]: I0417 
00:01:20.884074 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-67vgr_2665ad6e-102c-40fd-8fac-4e7fdd52738a/kube-storage-version-migrator-operator/1.log" Apr 17 00:01:20.885012 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:20.884991 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-67vgr_2665ad6e-102c-40fd-8fac-4e7fdd52738a/kube-storage-version-migrator-operator/0.log" Apr 17 00:01:22.076526 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.076495 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wnp9b_5c265860-ea8b-4315-acf6-bbb9ee728fe8/kube-multus-additional-cni-plugins/0.log" Apr 17 00:01:22.095599 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.095568 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wnp9b_5c265860-ea8b-4315-acf6-bbb9ee728fe8/egress-router-binary-copy/0.log" Apr 17 00:01:22.115097 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.115074 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wnp9b_5c265860-ea8b-4315-acf6-bbb9ee728fe8/cni-plugins/0.log" Apr 17 00:01:22.136664 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.136629 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wnp9b_5c265860-ea8b-4315-acf6-bbb9ee728fe8/bond-cni-plugin/0.log" Apr 17 00:01:22.158503 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.158477 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wnp9b_5c265860-ea8b-4315-acf6-bbb9ee728fe8/routeoverride-cni/0.log" Apr 17 00:01:22.180508 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.180481 
2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wnp9b_5c265860-ea8b-4315-acf6-bbb9ee728fe8/whereabouts-cni-bincopy/0.log" Apr 17 00:01:22.199052 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.199025 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wnp9b_5c265860-ea8b-4315-acf6-bbb9ee728fe8/whereabouts-cni/0.log" Apr 17 00:01:22.375880 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.375794 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mwc9h_80d77b0e-6b2a-4741-be2b-8b95c72b915e/kube-multus/0.log" Apr 17 00:01:22.530079 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.530043 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zp26z_e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2/network-metrics-daemon/0.log" Apr 17 00:01:22.548660 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:22.548637 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zp26z_e3ac28c9-9db2-43a5-bb52-bc7930fe2ab2/kube-rbac-proxy/0.log" Apr 17 00:01:23.844044 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:23.844016 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-controller/0.log" Apr 17 00:01:23.859973 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:23.859943 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/0.log" Apr 17 00:01:23.869922 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:23.869889 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovn-acl-logging/1.log" Apr 17 00:01:23.889116 ip-10-0-131-43 kubenswrapper[2564]: I0417 
00:01:23.889087 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/kube-rbac-proxy-node/0.log" Apr 17 00:01:23.909102 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:23.909079 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 00:01:23.925615 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:23.925592 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/northd/0.log" Apr 17 00:01:23.945338 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:23.945315 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/nbdb/0.log" Apr 17 00:01:23.967718 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:23.967672 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/sbdb/0.log" Apr 17 00:01:24.078371 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:24.078334 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md28x_4457788c-bfe7-45d0-9674-8966cbeef7a6/ovnkube-controller/0.log" Apr 17 00:01:25.140835 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:25.140806 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-gwhc8_d2fcd77b-9751-44d5-a36b-64111dfec87c/check-endpoints/0.log" Apr 17 00:01:25.184325 ip-10-0-131-43 kubenswrapper[2564]: I0417 00:01:25.184292 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rl6cs_a35b971f-784e-46e9-b251-dbb9a720c52f/network-check-target-container/0.log"