Apr 20 21:44:37.775759 ip-10-0-137-199 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 21:44:37.775771 ip-10-0-137-199 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 21:44:37.775778 ip-10-0-137-199 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 21:44:37.775993 ip-10-0-137-199 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 21:44:48.025101 ip-10-0-137-199 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 21:44:48.025120 ip-10-0-137-199 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot c75544b4d2a243ea828af21850171281 --
Apr 20 21:47:11.063550 ip-10-0-137-199 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 21:47:11.520549 ip-10-0-137-199 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:47:11.520549 ip-10-0-137-199 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 21:47:11.520549 ip-10-0-137-199 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:47:11.520549 ip-10-0-137-199 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 21:47:11.520549 ip-10-0-137-199 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:47:11.523474 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.523388 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 21:47:11.529600 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529585 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:11.529600 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529600 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529604 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529607 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529610 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529613 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529616 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529619 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529622 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529625 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529627 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529630 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529632 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529635 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529638 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529641 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529643 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529646 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529649 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529652 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529655 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:11.529669 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529657 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529660 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529662 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529665 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529667 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529670 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529673 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529675 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529678 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529681 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529684 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529686 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529689 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529692 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529694 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529697 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529699 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529702 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529704 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529707 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:11.530142 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529709 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529712 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529714 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529717 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529719 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529722 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529724 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529726 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529729 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529732 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529735 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529737 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529739 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529742 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529745 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529748 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529751 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529754 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529756 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529759 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:11.530702 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529761 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529764 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529766 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529769 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529771 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529774 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529777 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529779 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529782 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529784 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529787 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529789 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529792 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529795 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529797 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529801 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529806 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529809 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529813 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:11.531193 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529817 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529819 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529822 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529825 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529828 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.529831 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530202 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530208 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530211 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530213 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530216 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530218 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530221 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530224 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530226 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530229 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530231 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530234 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530236 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530239 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:11.531663 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530241 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530244 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530246 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530249 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530251 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530254 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530256 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530259 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530263 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530266 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530269 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530272 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530274 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530295 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530299 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530303 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530308 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530312 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530315 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:11.532143 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530319 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530322 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530326 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530330 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530333 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530336 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530338 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530341 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530345 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530349 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530351 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530354 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530357 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530359 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530362 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530364 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530367 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530369 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530372 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:11.532627 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530374 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530377 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530379 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530382 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530384 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530386 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530389 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530392 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530394 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530397 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530400 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530403 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530406 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530409 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530411 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530414 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530416 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530419 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530421 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530424 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:11.533084 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530426 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530429 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530431 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530434 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530436 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530439 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530441 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530443 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530446 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530448 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530451 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530453 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530456 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.530459 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531208 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531217 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531229 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531233 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531237 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531241 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531245 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 21:47:11.533610 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531250 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531254 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531257 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531261 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531265 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531268 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531271 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531274 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531291 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531294 2566 flags.go:64] FLAG: --cloud-config=""
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531297 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531300 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531304 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531308 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531311 2566 flags.go:64] FLAG: --config-dir=""
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531314 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531317 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531321 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531324 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531327 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531331 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531334 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531337 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531340 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531343 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 21:47:11.534143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531346 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531350 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531353 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531356 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531359 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531362 2566 flags.go:64] FLAG: --enable-server="true"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531365 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531369 2566 flags.go:64] FLAG: --event-burst="100"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531372 2566 flags.go:64] FLAG: --event-qps="50"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531375 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531378 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531382 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531386 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531389 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531392 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531395 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531398 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531400 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 21:47:11.534753 ip-10-0-137-199
kubenswrapper[2566]: I0420 21:47:11.531403 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531406 2566 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531409 2566 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531412 2566 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531414 2566 flags.go:64] FLAG: --feature-gates="" Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531418 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531421 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 21:47:11.534753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531424 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531427 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531433 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531436 2566 flags.go:64] FLAG: --help="false" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531439 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-137-199.ec2.internal" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531442 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531445 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531448 2566 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531451 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531454 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531459 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531462 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531464 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531467 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531470 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531473 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531476 2566 flags.go:64] FLAG: --kube-reserved="" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531478 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531481 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531484 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531487 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 21:47:11.535386 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:47:11.531490 2566 flags.go:64] FLAG: --lock-file="" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531493 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531496 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 21:47:11.535386 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531499 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531504 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531507 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531509 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531512 2566 flags.go:64] FLAG: --logging-format="text" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531515 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531518 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531521 2566 flags.go:64] FLAG: --manifest-url="" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531524 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531528 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531532 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531536 2566 flags.go:64] FLAG: --max-pods="110" Apr 20 
21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531539 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531542 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531545 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531548 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531550 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531553 2566 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531561 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531568 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531571 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531573 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531576 2566 flags.go:64] FLAG: --pod-cidr="" Apr 20 21:47:11.536008 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531579 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531584 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:47:11.531587 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531590 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531593 2566 flags.go:64] FLAG: --port="10250" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531596 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531599 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f284758eb070bef5" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531603 2566 flags.go:64] FLAG: --qos-reserved="" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531605 2566 flags.go:64] FLAG: --read-only-port="10255" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531608 2566 flags.go:64] FLAG: --register-node="true" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531611 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531614 2566 flags.go:64] FLAG: --register-with-taints="" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531618 2566 flags.go:64] FLAG: --registry-burst="10" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531620 2566 flags.go:64] FLAG: --registry-qps="5" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531623 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531626 2566 flags.go:64] FLAG: --reserved-memory="" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531630 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531632 2566 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531635 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531638 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531642 2566 flags.go:64] FLAG: --runonce="false" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531645 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531648 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531651 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531654 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531657 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 21:47:11.536579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531660 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531664 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531667 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531670 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531673 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531675 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 
21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531678 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531681 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531684 2566 flags.go:64] FLAG: --system-cgroups="" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531687 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531692 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531695 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531698 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531702 2566 flags.go:64] FLAG: --tls-min-version="" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531705 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531707 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531710 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531713 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531716 2566 flags.go:64] FLAG: --v="2" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531720 2566 flags.go:64] FLAG: --version="false" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531724 2566 flags.go:64] FLAG: --vmodule="" 
Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531728 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.531731 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531846 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531850 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 21:47:11.537232 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531853 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531858 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531860 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531863 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531867 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531871 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531874 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531876 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531880 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531883 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531885 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531888 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531891 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531893 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531896 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531898 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531901 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531903 2566 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531906 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 21:47:11.537830 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531909 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531911 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531914 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531917 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531920 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531922 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531925 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531928 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531930 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531933 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531935 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 21:47:11.538324 
ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531937 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531940 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531943 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531947 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531949 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531952 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531954 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531957 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531959 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 21:47:11.538324 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531962 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531965 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531968 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531970 2566 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDC Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531973 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531975 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531978 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531981 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531983 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531986 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531988 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531991 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531993 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531996 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.531999 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532001 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532004 2566 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532006 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532009 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532011 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 21:47:11.538857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532014 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532016 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532019 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532021 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532024 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532026 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532030 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532033 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532035 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 
21:47:11.532038 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532042 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532045 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532049 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532053 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532056 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532058 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532061 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532064 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532066 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:11.539412 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532069 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:11.539958 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532072 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:11.539958 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532075 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:11.539958 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532077 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:11.539958 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532080 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:11.539958 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.532082 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:11.539958 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.533304 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:47:11.543337 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.543316 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 21:47:11.543337 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.543334 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543381 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543386 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543389 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543392 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543396 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543398 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543401 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543404 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543407 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543411 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543414 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543418 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543420 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543423 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543426 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543428 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543431 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543434 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:11.543461 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543436 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543439 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543441 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543446 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543450 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543453 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543456 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543459 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543462 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543465 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543468 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543471 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543474 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543477 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543480 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543482 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543485 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543488 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543490 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543493 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:11.543968 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543495 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543498 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543500 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543503 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543505 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543508 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543510 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543513 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543516 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543519 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543521 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543524 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543526 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543529 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543532 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543535 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543537 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543540 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543543 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543545 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:11.544470 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543548 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543551 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543554 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543556 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543559 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543561 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543564 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543566 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543569 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543571 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543574 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543577 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543579 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543582 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543584 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543587 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543589 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543592 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543595 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543598 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:11.544950 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543601 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543603 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543606 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543609 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543611 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543614 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543617 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:11.545457 ip-10-0-137-199
kubenswrapper[2566]: W0420 21:47:11.543620 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.543625 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543737 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543742 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543745 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543748 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543751 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543754 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543757 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:11.545457 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543759 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543762 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543765 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543768 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543770 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543773 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543775 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543778 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543780 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543783 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543785 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543788 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543791 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543793 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543796 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543810 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543814 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543818 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543821 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:11.545857 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543824 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543826 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543829 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543832 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543835 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543838 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543841 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543843 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543847 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543849 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543852 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543854 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543857 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543859 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543862 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543864 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543867 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543869 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543872 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:11.546342 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543874 2566
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543877 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543879 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543882 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543884 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543887 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543890 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543892 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543895 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543897 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543901 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543903 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543906 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543908 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543911 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543913 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543916 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543919 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543922 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543925 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:11.546803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543927 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543930 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543933 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543935 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543938 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543940 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543943 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543945 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543948 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543950 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543953 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543957 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543960 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543962 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543965 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543968 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543970 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543973 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543975 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543978 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:11.547423 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:11.543981 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:11.547904 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.543985 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:47:11.547904 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.544082 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 21:47:11.547904 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.546595 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 21:47:11.547904 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.547516 2566 server.go:1019] "Starting client certificate rotation"
Apr 20 21:47:11.547904 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.547613 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 21:47:11.547904 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.547660 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 21:47:11.575619 ip-10-0-137-199 kubenswrapper[2566]:
I0420 21:47:11.575601 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 21:47:11.578114 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.578101 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 21:47:11.595644 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.595623 2566 log.go:25] "Validated CRI v1 runtime API" Apr 20 21:47:11.601981 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.601963 2566 log.go:25] "Validated CRI v1 image API" Apr 20 21:47:11.604864 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.604839 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 21:47:11.605855 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.605838 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 21:47:11.608118 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.608100 2566 fs.go:135] Filesystem UUIDs: map[6bb2f77d-813e-46b6-a178-e2ee48b685cb:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 ff949e12-d246-456c-89fd-b5fa570215b7:/dev/nvme0n1p3] Apr 20 21:47:11.608162 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.608119 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 21:47:11.614027 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.613925 2566 manager.go:217] Machine: 
{Timestamp:2026-04-20 21:47:11.611773329 +0000 UTC m=+0.420460685 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099928 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bc68bb8abdc95bcdb735866798dd1 SystemUUID:ec2bc68b-b8ab-dc95-bcdb-735866798dd1 BootID:c75544b4-d2a2-43ea-828a-f21850171281 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8f:88:21:9f:69 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8f:88:21:9f:69 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:0f:c7:6f:78:96 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 
Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 21:47:11.614027 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.614022 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 21:47:11.614133 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.614104 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 21:47:11.616824 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.616778 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 21:47:11.616962 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.616827 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config"
nodeConfig={"NodeName":"ip-10-0-137-199.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 21:47:11.617009 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.616972 2566 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 21:47:11.617009 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.616980 2566 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 21:47:11.617009 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.616993
2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 21:47:11.617009 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.617003 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 21:47:11.618321 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.618311 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 21:47:11.618461 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.618452 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 21:47:11.620892 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.620878 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7hxdl"
Apr 20 21:47:11.621392 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.621382 2566 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 21:47:11.621422 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.621397 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 21:47:11.621422 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.621409 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 21:47:11.621422 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.621421 2566 kubelet.go:397] "Adding apiserver pod source"
Apr 20 21:47:11.621529 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.621429 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 21:47:11.622459 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.622448 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 21:47:11.622496 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.622468 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 21:47:11.625469 ip-10-0-137-199
kubenswrapper[2566]: I0420 21:47:11.625450 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 21:47:11.626393 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.626378 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7hxdl"
Apr 20 21:47:11.626817 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.626804 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 21:47:11.628776 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628762 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 21:47:11.628838 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628785 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 21:47:11.628838 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628795 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 21:47:11.628838 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628812 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 21:47:11.628838 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628821 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 21:47:11.628838 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628830 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 21:47:11.628838 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628839 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 21:47:11.628996 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628848 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 21:47:11.628996 ip-10-0-137-199
kubenswrapper[2566]: I0420 21:47:11.628857 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 21:47:11.628996 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628867 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 21:47:11.628996 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628879 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 21:47:11.628996 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.628890 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 21:47:11.631615 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.631603 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 21:47:11.631615 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.631616 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 21:47:11.633500 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.633483 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:11.635179 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.635167 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 21:47:11.635220 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.635202 2566 server.go:1295] "Started kubelet"
Apr 20 21:47:11.637368 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.637335 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:11.637910 ip-10-0-137-199 systemd[1]: Started Kubernetes Kubelet.
Apr 20 21:47:11.638591 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.638555 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 21:47:11.638991 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.638941 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 21:47:11.639063 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.639004 2566 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 21:47:11.639156 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.639138 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-199.ec2.internal" not found
Apr 20 21:47:11.640256 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.640241 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 21:47:11.641461 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.641446 2566 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 21:47:11.645809 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.645787 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 21:47:11.645876 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.645798 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 21:47:11.647819 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.647799 2566 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 21:47:11.647939 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.647912 2566 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 21:47:11.648190 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:11.647844 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-199.ec2.internal\" not found"
Apr 20 21:47:11.648299 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.647981 2566 factory.go:55] Registering systemd
factory
Apr 20 21:47:11.648356 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.648305 2566 factory.go:223] Registration of the systemd container factory successfully
Apr 20 21:47:11.648356 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.648240 2566 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 21:47:11.648435 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.648362 2566 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 21:47:11.648435 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.647827 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 21:47:11.648527 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.648513 2566 factory.go:153] Registering CRI-O factory
Apr 20 21:47:11.648527 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.648526 2566 factory.go:223] Registration of the crio container factory successfully
Apr 20 21:47:11.648630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.648575 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 21:47:11.648630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.648595 2566 factory.go:103] Registering Raw factory
Apr 20 21:47:11.648630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.648605 2566 manager.go:1196] Started watching for new ooms in manager
Apr 20 21:47:11.649498 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.649476 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:11.649747 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.649724 2566 manager.go:319] Starting recovery of all containers
Apr 20 21:47:11.650632 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:11.650608 2566 kubelet.go:1618] "Image garbage collection failed once.
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 21:47:11.652715 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:11.652661 2566 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-199.ec2.internal\" not found" node="ip-10-0-137-199.ec2.internal"
Apr 20 21:47:11.653851 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.653828 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-199.ec2.internal" not found
Apr 20 21:47:11.659225 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.659208 2566 manager.go:324] Recovery completed
Apr 20 21:47:11.664629 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.664529 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:47:11.667253 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.667237 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-199.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:47:11.667336 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.667264 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-199.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:47:11.667336 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.667275 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-199.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:47:11.667738 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.667723 2566 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 21:47:11.667738 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.667736 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 21:47:11.667810 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.667753 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 21:47:11.669998 ip-10-0-137-199
kubenswrapper[2566]: I0420 21:47:11.669979 2566 policy_none.go:49] "None policy: Start"
Apr 20 21:47:11.669998 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.669998 2566 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 21:47:11.670136 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.670011 2566 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 21:47:11.712845 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.712830 2566 manager.go:341] "Starting Device Plugin manager"
Apr 20 21:47:11.712956 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:11.712857 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 21:47:11.712956 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.712867 2566 server.go:85] "Starting device plugin registration server"
Apr 20 21:47:11.713072 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.713057 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-199.ec2.internal" not found
Apr 20 21:47:11.713134 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.713107 2566 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 21:47:11.713176 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.713120 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 21:47:11.713686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.713668 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 21:47:11.713779 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.713745 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 21:47:11.713779 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.713754 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 21:47:11.715598 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:11.714097 2566
eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 21:47:11.715598 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:11.714150 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-199.ec2.internal\" not found"
Apr 20 21:47:11.767666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.767637 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 21:47:11.768857 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.768837 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 21:47:11.768948 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.768862 2566 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 21:47:11.768948 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.768877 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 21:47:11.768948 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.768884 2566 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 21:47:11.768948 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:11.768915 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 21:47:11.771408 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.771364 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:11.813741 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.813725 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:47:11.814529 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.814506 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-199.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:47:11.814603 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.814532 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-199.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:47:11.814603 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.814543 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-199.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:47:11.814603 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.814565 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-199.ec2.internal"
Apr 20 21:47:11.824012 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.823993 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-199.ec2.internal"
Apr 20 21:47:11.869870 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.869837 2566 kubelet.go:2537] "SyncLoop ADD" source="file"
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal"]
Apr 20 21:47:11.872044 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.872025 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:11.872044 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.872034 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:11.894787 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.894770 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:11.899131 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.899115 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:11.910875 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.910854 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 21:47:11.910941 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:11.910856 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 21:47:12.049450 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.049393 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/aeca6dfa04aa73a6b44d98f09ffc3dbc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal\" (UID: \"aeca6dfa04aa73a6b44d98f09ffc3dbc\") "
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.049450 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.049421 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aeca6dfa04aa73a6b44d98f09ffc3dbc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal\" (UID: \"aeca6dfa04aa73a6b44d98f09ffc3dbc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.049450 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.049439 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22aaf800e14f7250fca48df51b47e1cc-config\") pod \"kube-apiserver-proxy-ip-10-0-137-199.ec2.internal\" (UID: \"22aaf800e14f7250fca48df51b47e1cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.150248 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.150223 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/aeca6dfa04aa73a6b44d98f09ffc3dbc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal\" (UID: \"aeca6dfa04aa73a6b44d98f09ffc3dbc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.150362 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.150250 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aeca6dfa04aa73a6b44d98f09ffc3dbc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal\" (UID: \"aeca6dfa04aa73a6b44d98f09ffc3dbc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.150362 ip-10-0-137-199
kubenswrapper[2566]: I0420 21:47:12.150267 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22aaf800e14f7250fca48df51b47e1cc-config\") pod \"kube-apiserver-proxy-ip-10-0-137-199.ec2.internal\" (UID: \"22aaf800e14f7250fca48df51b47e1cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.150362 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.150336 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aeca6dfa04aa73a6b44d98f09ffc3dbc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal\" (UID: \"aeca6dfa04aa73a6b44d98f09ffc3dbc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.150456 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.150387 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22aaf800e14f7250fca48df51b47e1cc-config\") pod \"kube-apiserver-proxy-ip-10-0-137-199.ec2.internal\" (UID: \"22aaf800e14f7250fca48df51b47e1cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.150456 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.150414 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/aeca6dfa04aa73a6b44d98f09ffc3dbc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal\" (UID: \"aeca6dfa04aa73a6b44d98f09ffc3dbc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.214398 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.214363 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.215462 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.215446 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal"
Apr 20 21:47:12.547441 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.547361 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 21:47:12.547897 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.547513 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 21:47:12.547897 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.547514 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 21:47:12.547897 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.547515 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 21:47:12.621748 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.621722 2566 apiserver.go:52] "Watching apiserver"
Apr 20 21:47:12.628329 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.628244 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 21:42:11 +0000
UTC" deadline="2027-11-13 07:24:20.780717352 +0000 UTC"
Apr 20 21:47:12.628329 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.628323 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13713h37m8.152397565s"
Apr 20 21:47:12.629804 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.629784 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 21:47:12.631190 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.631170 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-j8c9k","openshift-network-diagnostics/network-check-target-pt7fs","openshift-network-operator/iptables-alerter-t8r9c","kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf","openshift-cluster-node-tuning-operator/tuned-jrqrk","openshift-dns/node-resolver-t72lj","openshift-ovn-kubernetes/ovnkube-node-jvkzb","kube-system/konnectivity-agent-ljbjz","openshift-image-registry/node-ca-8b2mt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal","openshift-multus/multus-additional-cni-plugins-f9k89","openshift-multus/multus-swrdw"]
Apr 20 21:47:12.633528 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.633509 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:12.633614 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:12.633593 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:12.634619 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.634601 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:12.634680 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:12.634661 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:12.635518 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.635499 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.636478 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.636455 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.637497 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.637471 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.637601 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.637581 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/node-resolver-t72lj"
Apr 20 21:47:12.638414 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.638269 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.638414 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.638366 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-22dw7\""
Apr 20 21:47:12.638561 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.638541 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.638674 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.638658 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 21:47:12.639089 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.639065 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.639221 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.639204 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bblwp\""
Apr 20 21:47:12.639595 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.639413 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 21:47:12.639595 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.639478 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.640172 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.639798 2566 reflector.go:430]
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 21:47:12.640172 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.639846 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.640172 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.640009 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r6jqw\"" Apr 20 21:47:12.640619 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.640570 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 21:47:12.640731 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.640697 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sdzrt\"" Apr 20 21:47:12.640839 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.640821 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 21:47:12.640999 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.640939 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 21:47:12.641834 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.641818 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:12.642563 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.642548 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 21:47:12.642652 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.642590 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 21:47:12.643148 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.643133 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8b2mt" Apr 20 21:47:12.643752 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.643580 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 21:47:12.643752 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.643621 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 21:47:12.643752 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.643631 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 21:47:12.643752 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.643621 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 21:47:12.643752 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.643586 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wj45z\"" Apr 20 21:47:12.644040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.643973 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" 
Apr 20 21:47:12.644085 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.644050 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jjmbf\""
Apr 20 21:47:12.644175 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.644156 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 21:47:12.644388 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.644372 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.645997 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.645573 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.645997 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.645754 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4nmcg\""
Apr 20 21:47:12.645997 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.645887 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 21:47:12.646177 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.646150 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.646359 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.646339 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.646632 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.646545 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 21:47:12.646719 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.646677 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 21:47:12.646791 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.646779 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.646978 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.646953 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.647361 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.647074 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 21:47:12.647361 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.647088 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-twf4d\""
Apr 20 21:47:12.647361 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.647143 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 21:47:12.648492 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.648476 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j7kzm\""
Apr 20 21:47:12.648724 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.648695 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 21:47:12.649471 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.649455 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 21:47:12.652665 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652646 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-netns\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.652749 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652673 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:12.652749 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652691 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-kubernetes\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.652749 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652706 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-system-cni-dir\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.652749 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652723 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-var-lib-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.652749 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652758 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-cni-bin\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.652933 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652793 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-sys-fs\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.652933 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652810 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.652933 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652829 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjt5l\" (UniqueName: \"kubernetes.io/projected/b6c011cb-eacd-4c0a-88d4-f902e63941c3-kube-api-access-kjt5l\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.652933 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652850 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-slash\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.652933 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652865 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-node-log\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.652933 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652879 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-ovnkube-script-lib\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.652933 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652910 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqkx\" (UniqueName: \"kubernetes.io/projected/a3b76653-340c-4572-9e21-939d7f3ef9ae-kube-api-access-zkqkx\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.652933 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652933 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-lib-modules\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652953 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-kubelet\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.652985 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653015 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653037 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55307413-a629-4893-b816-dd674a0d602f-cni-binary-copy\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653064 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-k8s-cni-cncf-io\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653080 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-registration-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653126 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-systemd\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653152 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.653184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653173 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653192 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-cni-multus\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653207 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-etc-kubernetes\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653238 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-etc-selinux\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7201191d-577f-470c-81dc-ec7f86680c09-hosts-file\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653273 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-env-overrides\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653322 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-system-cni-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653344 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-kubelet\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653396 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f2edab0e-3795-4fcc-9d28-b2979d98277c-iptables-alerter-script\") pod \"iptables-alerter-t8r9c\" (UID: \"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653411 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-device-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653432 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysconfig\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653446 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysctl-conf\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653462 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-var-lib-kubelet\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653476 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-sys\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653494 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jqx\" (UniqueName: \"kubernetes.io/projected/7201191d-577f-470c-81dc-ec7f86680c09-kube-api-access-48jqx\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653507 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-systemd-units\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.653520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653524 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-run-netns\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-ovn\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55307413-a629-4893-b816-dd674a0d602f-multus-daemon-config\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653586 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-run\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653604 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-cni-netd\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653622 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-socket-dir-parent\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653642 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6csx\" (UniqueName: \"kubernetes.io/projected/55307413-a629-4893-b816-dd674a0d602f-kube-api-access-k6csx\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653681 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2edab0e-3795-4fcc-9d28-b2979d98277c-host-slash\") pod \"iptables-alerter-t8r9c\" (UID: \"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653718 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-socket-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653749 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-modprobe-d\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653770 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cnibin\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653791 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-os-release\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653812 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88bl\" (UniqueName: \"kubernetes.io/projected/86a4e942-ea9e-4978-b92e-c96688b972a3-kube-api-access-t88bl\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653826 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-tmp\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653841 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-log-socket\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653860 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-os-release\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653882 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-host\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.654112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653912 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4r7m\" (UniqueName: \"kubernetes.io/projected/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-kube-api-access-w4r7m\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653933 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653951 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-ovnkube-config\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-serviceca\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt"
Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653979 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-hostroot\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.653993 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8d78\" (UniqueName: \"kubernetes.io/projected/f2edab0e-3795-4fcc-9d28-b2979d98277c-kube-api-access-l8d78\") pod \"iptables-alerter-t8r9c\" (UID: \"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654006 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654019 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-tuned\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654057 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-etc-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654092 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654117 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/000763ee-232e-428c-84f2-4ca88f559d17-ovn-node-metrics-cert\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654141 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslv8\" (UniqueName: \"kubernetes.io/projected/000763ee-232e-428c-84f2-4ca88f559d17-kube-api-access-mslv8\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654166 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/39147d66-f51d-40db-a98e-ac955007f9af-konnectivity-ca\") pod \"konnectivity-agent-ljbjz\" (UID: \"39147d66-f51d-40db-a98e-ac955007f9af\") " pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654189 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysctl-d\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654211 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/39147d66-f51d-40db-a98e-ac955007f9af-agent-certs\") pod \"konnectivity-agent-ljbjz\" (UID: \"39147d66-f51d-40db-a98e-ac955007f9af\") " pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654232 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-host\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-cni-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.654758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654322 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-cnibin\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.655216 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654345 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-conf-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.655216 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654371 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-multus-certs\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.655216 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654427 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs" Apr 20 21:47:12.655216 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654456 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7201191d-577f-470c-81dc-ec7f86680c09-tmp-dir\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj" Apr 20 21:47:12.655216 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654490 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-systemd\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.655216 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654509 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-cni-bin\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.655216 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.654523 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jch6\" (UniqueName: \"kubernetes.io/projected/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-kube-api-access-6jch6\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt" Apr 20 21:47:12.655730 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.655714 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 21:47:12.675958 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.675942 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-629fr" Apr 20 21:47:12.683046 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.683025 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-629fr" Apr 20 21:47:12.699445 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:12.699417 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeca6dfa04aa73a6b44d98f09ffc3dbc.slice/crio-5a94d034a1a2d9a77fce3d841a9790684906bd719a2018a14d76fc75f7b5fd40 WatchSource:0}: Error finding container 5a94d034a1a2d9a77fce3d841a9790684906bd719a2018a14d76fc75f7b5fd40: Status 404 returned error can't find the container with id 5a94d034a1a2d9a77fce3d841a9790684906bd719a2018a14d76fc75f7b5fd40 Apr 20 
21:47:12.699829 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:12.699810 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22aaf800e14f7250fca48df51b47e1cc.slice/crio-fb3679f1deeebb58afff1d9898efb1fcfcdbc971cd7d17b753cc197dd7524d47 WatchSource:0}: Error finding container fb3679f1deeebb58afff1d9898efb1fcfcdbc971cd7d17b753cc197dd7524d47: Status 404 returned error can't find the container with id fb3679f1deeebb58afff1d9898efb1fcfcdbc971cd7d17b753cc197dd7524d47 Apr 20 21:47:12.704692 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.704678 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:47:12.755439 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755411 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-netns\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.755439 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755439 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:47:12.755582 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755455 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-kubernetes\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 21:47:12.755582 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:47:12.755471 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-system-cni-dir\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:12.755582 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755499 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-var-lib-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755582 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755522 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-cni-bin\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.755582 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755546 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-sys-fs\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" Apr 20 21:47:12.755582 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755557 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-kubernetes\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 
21:47:12.755582 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755564 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-var-lib-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755582 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755557 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-system-cni-dir\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:12.755586 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755608 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-netns\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755608 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-sys-fs\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755588 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-cni-bin\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:12.755656 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:13.255634436 +0000 UTC m=+2.064321795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755573 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755693 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjt5l\" (UniqueName: \"kubernetes.io/projected/b6c011cb-eacd-4c0a-88d4-f902e63941c3-kube-api-access-kjt5l\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755720 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-slash\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755744 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-node-log\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755769 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-ovnkube-script-lib\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755796 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-node-log\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755801 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-slash\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755796 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqkx\" (UniqueName: 
\"kubernetes.io/projected/a3b76653-340c-4572-9e21-939d7f3ef9ae-kube-api-access-zkqkx\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755852 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-lib-modules\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755879 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-kubelet\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755907 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.755959 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755933 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755961 
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55307413-a629-4893-b816-dd674a0d602f-cni-binary-copy\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755959 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-kubelet\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.755999 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756021 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756029 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-k8s-cni-cncf-io\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.756666 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:47:12.756033 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-lib-modules\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756061 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-registration-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756068 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-k8s-cni-cncf-io\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756081 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-systemd\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756122 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-registration-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756123 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-systemd\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756135 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756139 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756186 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756207 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-cni-multus\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756232 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-etc-kubernetes\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.756666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756240 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-cni-multus\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756257 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-etc-selinux\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756276 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-etc-kubernetes\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756308 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/7201191d-577f-470c-81dc-ec7f86680c09-hosts-file\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj" Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756350 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-env-overrides\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756368 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-system-cni-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756377 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-ovnkube-script-lib\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756392 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-kubelet\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756407 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/7201191d-577f-470c-81dc-ec7f86680c09-hosts-file\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756408 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-etc-selinux\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756428 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f2edab0e-3795-4fcc-9d28-b2979d98277c-iptables-alerter-script\") pod \"iptables-alerter-t8r9c\" (UID: \"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756426 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-var-lib-kubelet\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756458 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-device-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756464 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-system-cni-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756491 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysconfig\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756509 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-device-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756514 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysctl-conf\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.757409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756539 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-var-lib-kubelet\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysconfig\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756560 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-sys\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756583 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48jqx\" (UniqueName: \"kubernetes.io/projected/7201191d-577f-470c-81dc-ec7f86680c09-kube-api-access-48jqx\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756606 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-systemd-units\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756618 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-var-lib-kubelet\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756629 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756631 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-run-netns\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756662 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-systemd-units\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756657 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b6c011cb-eacd-4c0a-88d4-f902e63941c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756671 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysctl-conf\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-ovn\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756706 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55307413-a629-4893-b816-dd674a0d602f-multus-daemon-config\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756716 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-sys\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-run\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756583 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55307413-a629-4893-b816-dd674a0d602f-cni-binary-copy\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756731 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-ovn\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756772 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-run-netns\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756775 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-cni-netd\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756797 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-run\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756808 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-socket-dir-parent\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756812 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-cni-netd\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756833 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6csx\" (UniqueName: \"kubernetes.io/projected/55307413-a629-4893-b816-dd674a0d602f-kube-api-access-k6csx\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756849 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-socket-dir-parent\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756859 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2edab0e-3795-4fcc-9d28-b2979d98277c-host-slash\") pod \"iptables-alerter-t8r9c\" (UID: \"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756857 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-env-overrides\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756905 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2edab0e-3795-4fcc-9d28-b2979d98277c-host-slash\") pod \"iptables-alerter-t8r9c\" (UID: \"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756941 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-socket-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756964 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f2edab0e-3795-4fcc-9d28-b2979d98277c-iptables-alerter-script\") pod \"iptables-alerter-t8r9c\" (UID: \"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-modprobe-d\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.756995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cnibin\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757020 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-os-release\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757047 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t88bl\" (UniqueName: \"kubernetes.io/projected/86a4e942-ea9e-4978-b92e-c96688b972a3-kube-api-access-t88bl\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757052 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-cnibin\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757073 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-tmp\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.758896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757086 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-socket-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757090 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-os-release\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757096 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-log-socket\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757087 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-modprobe-d\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757137 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55307413-a629-4893-b816-dd674a0d602f-multus-daemon-config\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-os-release\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757160 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-log-socket\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757170 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-host\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757186 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4r7m\" (UniqueName: \"kubernetes.io/projected/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-kube-api-access-w4r7m\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757204 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-os-release\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757207 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757236 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-ovnkube-config\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757237 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-host\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757312 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-serviceca\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757325 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b6c011cb-eacd-4c0a-88d4-f902e63941c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757339 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-hostroot\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757365 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8d78\" (UniqueName: \"kubernetes.io/projected/f2edab0e-3795-4fcc-9d28-b2979d98277c-kube-api-access-l8d78\") pod \"iptables-alerter-t8r9c\" (UID: \"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757388 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.759696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757407 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-hostroot\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757412 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-tuned\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757402 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757448 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-etc-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757473 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757484 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3b76653-340c-4572-9e21-939d7f3ef9ae-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/000763ee-232e-428c-84f2-4ca88f559d17-ovn-node-metrics-cert\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mslv8\" (UniqueName: \"kubernetes.io/projected/000763ee-232e-428c-84f2-4ca88f559d17-kube-api-access-mslv8\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757531 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-etc-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757540 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/39147d66-f51d-40db-a98e-ac955007f9af-konnectivity-ca\") pod \"konnectivity-agent-ljbjz\" (UID: \"39147d66-f51d-40db-a98e-ac955007f9af\") " pod="kube-system/konnectivity-agent-ljbjz"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757566 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysctl-d\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757603 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/39147d66-f51d-40db-a98e-ac955007f9af-agent-certs\") pod \"konnectivity-agent-ljbjz\" (UID: \"39147d66-f51d-40db-a98e-ac955007f9af\") " pod="kube-system/konnectivity-agent-ljbjz"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757632 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-host\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757656 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-cni-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-cnibin\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757683 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-serviceca\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757712 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/000763ee-232e-428c-84f2-4ca88f559d17-ovnkube-config\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757703 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-conf-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757736 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-host\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757761 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-multus-certs\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757772 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-cnibin\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757787 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757810 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7201191d-577f-470c-81dc-ec7f86680c09-tmp-dir\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757835 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-systemd\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757842 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-cni-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757859 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-cni-bin\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757887 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jch6\" (UniqueName: \"kubernetes.io/projected/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-kube-api-access-6jch6\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757887 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-multus-conf-dir\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757963 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-openvswitch\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.758106 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-run-systemd\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.757812 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55307413-a629-4893-b816-dd674a0d602f-host-run-multus-certs\") pod \"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.758107 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/39147d66-f51d-40db-a98e-ac955007f9af-konnectivity-ca\") pod \"konnectivity-agent-ljbjz\" (UID: \"39147d66-f51d-40db-a98e-ac955007f9af\") " pod="kube-system/konnectivity-agent-ljbjz"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.758146 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/000763ee-232e-428c-84f2-4ca88f559d17-host-cni-bin\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.758199 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7201191d-577f-470c-81dc-ec7f86680c09-tmp-dir\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.758206 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-sysctl-d\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.760333 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-tmp\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.760931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.760376 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-etc-tuned\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk"
Apr 20 21:47:12.761428 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.760506 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/000763ee-232e-428c-84f2-4ca88f559d17-ovn-node-metrics-cert\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb"
Apr 20 21:47:12.761428 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.760536 2566
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/39147d66-f51d-40db-a98e-ac955007f9af-agent-certs\") pod \"konnectivity-agent-ljbjz\" (UID: \"39147d66-f51d-40db-a98e-ac955007f9af\") " pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:12.769095 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:12.769068 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:47:12.769095 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:12.769096 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:47:12.769265 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:12.769117 2566 projected.go:194] Error preparing data for projected volume kube-api-access-54x8q for pod openshift-network-diagnostics/network-check-target-pt7fs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:12.769265 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:12.769184 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q podName:04d33160-cee6-4aaf-ab79-d806da372e92 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:13.269167034 +0000 UTC m=+2.077854400 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-54x8q" (UniqueName: "kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q") pod "network-check-target-pt7fs" (UID: "04d33160-cee6-4aaf-ab79-d806da372e92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:12.769265 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.769192 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jqx\" (UniqueName: \"kubernetes.io/projected/7201191d-577f-470c-81dc-ec7f86680c09-kube-api-access-48jqx\") pod \"node-resolver-t72lj\" (UID: \"7201191d-577f-470c-81dc-ec7f86680c09\") " pod="openshift-dns/node-resolver-t72lj" Apr 20 21:47:12.769989 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.769772 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjt5l\" (UniqueName: \"kubernetes.io/projected/b6c011cb-eacd-4c0a-88d4-f902e63941c3-kube-api-access-kjt5l\") pod \"multus-additional-cni-plugins-f9k89\" (UID: \"b6c011cb-eacd-4c0a-88d4-f902e63941c3\") " pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:12.769989 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.769787 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqkx\" (UniqueName: \"kubernetes.io/projected/a3b76653-340c-4572-9e21-939d7f3ef9ae-kube-api-access-zkqkx\") pod \"aws-ebs-csi-driver-node-8t5zf\" (UID: \"a3b76653-340c-4572-9e21-939d7f3ef9ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" Apr 20 21:47:12.771064 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.770990 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8d78\" (UniqueName: \"kubernetes.io/projected/f2edab0e-3795-4fcc-9d28-b2979d98277c-kube-api-access-l8d78\") pod \"iptables-alerter-t8r9c\" (UID: 
\"f2edab0e-3795-4fcc-9d28-b2979d98277c\") " pod="openshift-network-operator/iptables-alerter-t8r9c" Apr 20 21:47:12.771212 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.771191 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jch6\" (UniqueName: \"kubernetes.io/projected/ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468-kube-api-access-6jch6\") pod \"node-ca-8b2mt\" (UID: \"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468\") " pod="openshift-image-registry/node-ca-8b2mt" Apr 20 21:47:12.771693 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.771602 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4r7m\" (UniqueName: \"kubernetes.io/projected/3738fa41-3050-4602-8e4b-1cf6cf9e7e42-kube-api-access-w4r7m\") pod \"tuned-jrqrk\" (UID: \"3738fa41-3050-4602-8e4b-1cf6cf9e7e42\") " pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 21:47:12.771693 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.771673 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88bl\" (UniqueName: \"kubernetes.io/projected/86a4e942-ea9e-4978-b92e-c96688b972a3-kube-api-access-t88bl\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:47:12.772117 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.772072 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal" event={"ID":"aeca6dfa04aa73a6b44d98f09ffc3dbc","Type":"ContainerStarted","Data":"5a94d034a1a2d9a77fce3d841a9790684906bd719a2018a14d76fc75f7b5fd40"} Apr 20 21:47:12.772236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.772219 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6csx\" (UniqueName: \"kubernetes.io/projected/55307413-a629-4893-b816-dd674a0d602f-kube-api-access-k6csx\") pod 
\"multus-swrdw\" (UID: \"55307413-a629-4893-b816-dd674a0d602f\") " pod="openshift-multus/multus-swrdw" Apr 20 21:47:12.772236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.772226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mslv8\" (UniqueName: \"kubernetes.io/projected/000763ee-232e-428c-84f2-4ca88f559d17-kube-api-access-mslv8\") pod \"ovnkube-node-jvkzb\" (UID: \"000763ee-232e-428c-84f2-4ca88f559d17\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:12.772965 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.772946 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal" event={"ID":"22aaf800e14f7250fca48df51b47e1cc","Type":"ContainerStarted","Data":"fb3679f1deeebb58afff1d9898efb1fcfcdbc971cd7d17b753cc197dd7524d47"} Apr 20 21:47:12.965602 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.965573 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t8r9c" Apr 20 21:47:12.971803 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:12.971780 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2edab0e_3795_4fcc_9d28_b2979d98277c.slice/crio-55d3a8645a5836891d83de81119e7f4267fe08637422b20cbc911cf1cc63cc6e WatchSource:0}: Error finding container 55d3a8645a5836891d83de81119e7f4267fe08637422b20cbc911cf1cc63cc6e: Status 404 returned error can't find the container with id 55d3a8645a5836891d83de81119e7f4267fe08637422b20cbc911cf1cc63cc6e Apr 20 21:47:12.979238 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.979223 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" Apr 20 21:47:12.984859 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:12.984838 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b76653_340c_4572_9e21_939d7f3ef9ae.slice/crio-3eff56ce4b16a2896b59fd72a5dd30c61ab3598c837d53c37dd07ef1ea59a471 WatchSource:0}: Error finding container 3eff56ce4b16a2896b59fd72a5dd30c61ab3598c837d53c37dd07ef1ea59a471: Status 404 returned error can't find the container with id 3eff56ce4b16a2896b59fd72a5dd30c61ab3598c837d53c37dd07ef1ea59a471 Apr 20 21:47:12.998068 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:12.998047 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" Apr 20 21:47:13.002991 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:13.002968 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3738fa41_3050_4602_8e4b_1cf6cf9e7e42.slice/crio-b3014fa061b5027abb82afc9843bab039802fb1978970ea88bc3ca345d7b9fea WatchSource:0}: Error finding container b3014fa061b5027abb82afc9843bab039802fb1978970ea88bc3ca345d7b9fea: Status 404 returned error can't find the container with id b3014fa061b5027abb82afc9843bab039802fb1978970ea88bc3ca345d7b9fea Apr 20 21:47:13.005592 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.005578 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-t72lj" Apr 20 21:47:13.011239 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:13.011218 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7201191d_577f_470c_81dc_ec7f86680c09.slice/crio-884d9eed547dff1acfaf42161829d3a5666c8e43610c709a6f8264e814df24da WatchSource:0}: Error finding container 884d9eed547dff1acfaf42161829d3a5666c8e43610c709a6f8264e814df24da: Status 404 returned error can't find the container with id 884d9eed547dff1acfaf42161829d3a5666c8e43610c709a6f8264e814df24da Apr 20 21:47:13.012188 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.012167 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:13.017181 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.017161 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:13.019065 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:13.019029 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod000763ee_232e_428c_84f2_4ca88f559d17.slice/crio-edb3d487809c174a78af8eb8a5709cf68a433021f13cf54b6a9ac4e5ba66aedf WatchSource:0}: Error finding container edb3d487809c174a78af8eb8a5709cf68a433021f13cf54b6a9ac4e5ba66aedf: Status 404 returned error can't find the container with id edb3d487809c174a78af8eb8a5709cf68a433021f13cf54b6a9ac4e5ba66aedf Apr 20 21:47:13.023614 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.023346 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8b2mt" Apr 20 21:47:13.028429 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.028328 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f9k89" Apr 20 21:47:13.032756 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.032718 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-swrdw" Apr 20 21:47:13.033659 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:13.033370 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1b62dc_e2e3_4ab8_95b4_0ee7f0e09468.slice/crio-059b5d7b6df33518d885921a3fe913a35bd8f59686f6053c3411b699c844e7ba WatchSource:0}: Error finding container 059b5d7b6df33518d885921a3fe913a35bd8f59686f6053c3411b699c844e7ba: Status 404 returned error can't find the container with id 059b5d7b6df33518d885921a3fe913a35bd8f59686f6053c3411b699c844e7ba Apr 20 21:47:13.038110 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:47:13.038083 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c011cb_eacd_4c0a_88d4_f902e63941c3.slice/crio-4baeffd5315146c39ae075bf86dc67953da4f3d59fb8596810f50d14505189fa WatchSource:0}: Error finding container 4baeffd5315146c39ae075bf86dc67953da4f3d59fb8596810f50d14505189fa: Status 404 returned error can't find the container with id 4baeffd5315146c39ae075bf86dc67953da4f3d59fb8596810f50d14505189fa Apr 20 21:47:13.261328 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.261262 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:47:13.261494 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:13.261386 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:13.261494 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:13.261462 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:14.261428786 +0000 UTC m=+3.070116131 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:13.361834 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.361757 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs" Apr 20 21:47:13.361984 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:13.361965 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:47:13.362049 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:13.361995 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:47:13.362049 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:13.362007 2566 projected.go:194] Error preparing data for projected volume kube-api-access-54x8q for pod openshift-network-diagnostics/network-check-target-pt7fs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:13.362153 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:13.362065 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q podName:04d33160-cee6-4aaf-ab79-d806da372e92 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:14.362045785 +0000 UTC m=+3.170733131 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-54x8q" (UniqueName: "kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q") pod "network-check-target-pt7fs" (UID: "04d33160-cee6-4aaf-ab79-d806da372e92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:13.628053 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.627975 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-twt7t"] Apr 20 21:47:13.630211 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.630183 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.630340 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:13.630268 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232" Apr 20 21:47:13.664724 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.664538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0ce3c41-e846-4b03-82b0-0fae9d903232-kubelet-config\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.664724 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.664580 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0ce3c41-e846-4b03-82b0-0fae9d903232-dbus\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.664724 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.664639 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.683970 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.683890 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:42:12 +0000 UTC" deadline="2027-11-17 10:59:22.95653494 +0000 UTC" Apr 20 21:47:13.683970 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.683923 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13813h12m9.272615722s" Apr 20 21:47:13.690184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.689848 2566 
reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:47:13.727718 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.727694 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:47:13.765615 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.765584 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.765755 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.765659 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0ce3c41-e846-4b03-82b0-0fae9d903232-kubelet-config\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.765755 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.765686 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0ce3c41-e846-4b03-82b0-0fae9d903232-dbus\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.765871 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.765859 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0ce3c41-e846-4b03-82b0-0fae9d903232-dbus\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.765994 ip-10-0-137-199 kubenswrapper[2566]: 
E0420 21:47:13.765972 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:13.766096 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:13.766035 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret podName:d0ce3c41-e846-4b03-82b0-0fae9d903232 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:14.266016005 +0000 UTC m=+3.074703362 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret") pod "global-pull-secret-syncer-twt7t" (UID: "d0ce3c41-e846-4b03-82b0-0fae9d903232") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:13.766167 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.766103 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0ce3c41-e846-4b03-82b0-0fae9d903232-kubelet-config\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:13.790453 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.790415 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"edb3d487809c174a78af8eb8a5709cf68a433021f13cf54b6a9ac4e5ba66aedf"} Apr 20 21:47:13.800847 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.800815 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" event={"ID":"a3b76653-340c-4572-9e21-939d7f3ef9ae","Type":"ContainerStarted","Data":"3eff56ce4b16a2896b59fd72a5dd30c61ab3598c837d53c37dd07ef1ea59a471"} Apr 20 21:47:13.802656 
ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.802629 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t72lj" event={"ID":"7201191d-577f-470c-81dc-ec7f86680c09","Type":"ContainerStarted","Data":"884d9eed547dff1acfaf42161829d3a5666c8e43610c709a6f8264e814df24da"} Apr 20 21:47:13.805524 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.805500 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" event={"ID":"3738fa41-3050-4602-8e4b-1cf6cf9e7e42","Type":"ContainerStarted","Data":"b3014fa061b5027abb82afc9843bab039802fb1978970ea88bc3ca345d7b9fea"} Apr 20 21:47:13.813358 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.811935 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t8r9c" event={"ID":"f2edab0e-3795-4fcc-9d28-b2979d98277c","Type":"ContainerStarted","Data":"55d3a8645a5836891d83de81119e7f4267fe08637422b20cbc911cf1cc63cc6e"} Apr 20 21:47:13.816851 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.816810 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-swrdw" event={"ID":"55307413-a629-4893-b816-dd674a0d602f","Type":"ContainerStarted","Data":"c4a524ee27dec2de95981291c1910916c94afc3ee448b529d11905a523c8bfde"} Apr 20 21:47:13.821585 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.821533 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerStarted","Data":"4baeffd5315146c39ae075bf86dc67953da4f3d59fb8596810f50d14505189fa"} Apr 20 21:47:13.828035 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.827985 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8b2mt" 
event={"ID":"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468","Type":"ContainerStarted","Data":"059b5d7b6df33518d885921a3fe913a35bd8f59686f6053c3411b699c844e7ba"} Apr 20 21:47:13.840651 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:13.840608 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ljbjz" event={"ID":"39147d66-f51d-40db-a98e-ac955007f9af","Type":"ContainerStarted","Data":"6f4ce1b98fc290fe8fc9bcfd3b7362aa276c045056c3e695235b44554938722b"} Apr 20 21:47:14.109519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:14.109491 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:47:14.270555 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:14.270510 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:14.270741 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:14.270582 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:47:14.270741 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.270725 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:14.270855 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.270791 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs 
podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:16.27076938 +0000 UTC m=+5.079456744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:14.271217 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.271195 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:14.271344 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.271254 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret podName:d0ce3c41-e846-4b03-82b0-0fae9d903232 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:15.271238929 +0000 UTC m=+4.079926272 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret") pod "global-pull-secret-syncer-twt7t" (UID: "d0ce3c41-e846-4b03-82b0-0fae9d903232") : object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:14.372265 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:14.371611 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:14.372265 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.371784 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:47:14.372265 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.371804 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:47:14.372265 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.371817 2566 projected.go:194] Error preparing data for projected volume kube-api-access-54x8q for pod openshift-network-diagnostics/network-check-target-pt7fs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:14.372265 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.371874 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q podName:04d33160-cee6-4aaf-ab79-d806da372e92 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:16.371857098 +0000 UTC m=+5.180544458 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-54x8q" (UniqueName: "kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q") pod "network-check-target-pt7fs" (UID: "04d33160-cee6-4aaf-ab79-d806da372e92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:14.684815 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:14.684712 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:42:12 +0000 UTC" deadline="2028-02-03 09:13:00.750440696 +0000 UTC"
Apr 20 21:47:14.684815 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:14.684746 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15683h25m46.065698607s"
Apr 20 21:47:14.770437 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:14.769752 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:14.770437 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.769870 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:14.770437 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:14.770296 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:14.770437 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:14.770398 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:15.281314 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:15.281261 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:15.281488 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:15.281433 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:15.281566 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:15.281489 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret podName:d0ce3c41-e846-4b03-82b0-0fae9d903232 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:17.281472568 +0000 UTC m=+6.090159922 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret") pod "global-pull-secret-syncer-twt7t" (UID: "d0ce3c41-e846-4b03-82b0-0fae9d903232") : object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:15.769314 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:15.769269 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:15.769766 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:15.769429 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:16.289159 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:16.289116 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:16.289351 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:16.289327 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:16.289422 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:16.289395 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:20.289374031 +0000 UTC m=+9.098061391 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:16.389942 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:16.389903 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:16.390223 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:16.390102 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:47:16.390223 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:16.390131 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:47:16.390223 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:16.390145 2566 projected.go:194] Error preparing data for projected volume kube-api-access-54x8q for pod openshift-network-diagnostics/network-check-target-pt7fs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:16.390223 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:16.390203 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q podName:04d33160-cee6-4aaf-ab79-d806da372e92 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:20.390184588 +0000 UTC m=+9.198871951 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-54x8q" (UniqueName: "kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q") pod "network-check-target-pt7fs" (UID: "04d33160-cee6-4aaf-ab79-d806da372e92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:16.769850 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:16.769821 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:16.770356 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:16.769958 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:16.770356 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:16.770042 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:16.770356 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:16.770166 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:17.299021 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:17.298914 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:17.299206 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:17.299107 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:17.299206 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:17.299173 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret podName:d0ce3c41-e846-4b03-82b0-0fae9d903232 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:21.299153398 +0000 UTC m=+10.107840743 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret") pod "global-pull-secret-syncer-twt7t" (UID: "d0ce3c41-e846-4b03-82b0-0fae9d903232") : object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:17.769624 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:17.769481 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:17.769624 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:17.769610 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:18.769655 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:18.769621 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:18.770206 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:18.769746 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:18.770206 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:18.769823 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:18.770206 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:18.769951 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:19.769825 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:19.769794 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:19.770321 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:19.769930 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:20.324064 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:20.323992 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:20.324256 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:20.324158 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:20.324256 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:20.324234 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:28.324209197 +0000 UTC m=+17.132896547 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:20.424948 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:20.424909 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:20.425137 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:20.425115 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:47:20.425204 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:20.425142 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:47:20.425204 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:20.425155 2566 projected.go:194] Error preparing data for projected volume kube-api-access-54x8q for pod openshift-network-diagnostics/network-check-target-pt7fs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:20.425320 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:20.425217 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q podName:04d33160-cee6-4aaf-ab79-d806da372e92 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:28.425199426 +0000 UTC m=+17.233886782 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-54x8q" (UniqueName: "kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q") pod "network-check-target-pt7fs" (UID: "04d33160-cee6-4aaf-ab79-d806da372e92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:20.769371 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:20.769290 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:20.769544 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:20.769431 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:20.769951 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:20.769786 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:20.769951 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:20.769890 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:21.333301 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:21.333252 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:21.333492 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:21.333430 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:21.333492 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:21.333484 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret podName:d0ce3c41-e846-4b03-82b0-0fae9d903232 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:29.333469912 +0000 UTC m=+18.142157254 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret") pod "global-pull-secret-syncer-twt7t" (UID: "d0ce3c41-e846-4b03-82b0-0fae9d903232") : object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:21.773437 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:21.771646 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:21.773437 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:21.771794 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:22.770140 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:22.770102 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:22.770324 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:22.770103 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:22.770324 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:22.770258 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:22.770460 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:22.770327 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:23.769621 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:23.769588 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:23.770069 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:23.769730 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:24.769398 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:24.769366 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:24.769572 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:24.769369 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:24.769572 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:24.769476 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:24.769572 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:24.769564 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:25.770034 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:25.770001 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:25.770493 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:25.770125 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:26.769560 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:26.769522 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:26.769732 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:26.769522 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:26.769732 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:26.769645 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:26.769845 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:26.769758 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:27.770182 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:27.770147 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:27.770620 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:27.770291 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:28.387221 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:28.387186 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:28.387430 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:28.387333 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:28.387430 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:28.387393 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.387374468 +0000 UTC m=+33.196061813 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:28.487546 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:28.487502 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:28.487719 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:28.487653 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:47:28.487719 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:28.487680 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:47:28.487719 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:28.487694 2566 projected.go:194] Error preparing data for projected volume kube-api-access-54x8q for pod openshift-network-diagnostics/network-check-target-pt7fs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:28.487850 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:28.487753 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q podName:04d33160-cee6-4aaf-ab79-d806da372e92 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.487733607 +0000 UTC m=+33.296420949 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-54x8q" (UniqueName: "kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q") pod "network-check-target-pt7fs" (UID: "04d33160-cee6-4aaf-ab79-d806da372e92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:28.769450 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:28.769415 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:28.769636 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:28.769416 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:28.769636 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:28.769532 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:28.769636 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:28.769622 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:29.394181 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:29.394140 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:29.394577 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:29.394251 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:29.394577 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:29.394323 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret podName:d0ce3c41-e846-4b03-82b0-0fae9d903232 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:45.394308727 +0000 UTC m=+34.202996075 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret") pod "global-pull-secret-syncer-twt7t" (UID: "d0ce3c41-e846-4b03-82b0-0fae9d903232") : object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:29.769751 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:29.769496 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:29.769913 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:29.769849 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:30.769388 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:30.769350 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:30.769757 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:30.769458 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:30.769757 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:30.769511 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:30.769757 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:30.769605 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:31.770186 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.770015 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:31.770770 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:31.770291 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:31.875320 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.875275 2566 generic.go:358] "Generic (PLEG): container finished" podID="b6c011cb-eacd-4c0a-88d4-f902e63941c3" containerID="fc1afbdbdbd41f1297aa987d88c34d1b0d92ed5bcaf4d32658e34fcfb3fc9072" exitCode=0
Apr 20 21:47:31.875456 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.875356 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerDied","Data":"fc1afbdbdbd41f1297aa987d88c34d1b0d92ed5bcaf4d32658e34fcfb3fc9072"}
Apr 20 21:47:31.876815 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.876724 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8b2mt" event={"ID":"ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468","Type":"ContainerStarted","Data":"028b3513d4b20c8deb80cd0b6decbe7b5623e18bdbb2d7ad3738a5699379d66c"}
Apr 20 21:47:31.877993 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.877968 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ljbjz"
event={"ID":"39147d66-f51d-40db-a98e-ac955007f9af","Type":"ContainerStarted","Data":"8e28fc1d043bfa053c8556ce97ec7dfafe92a83fe457ca47f2d5442a6b81d407"} Apr 20 21:47:31.880373 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.880354 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 21:47:31.880669 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.880647 2566 generic.go:358] "Generic (PLEG): container finished" podID="000763ee-232e-428c-84f2-4ca88f559d17" containerID="fb23555fcdef4e1347c17d38fd319f4c4b6ec11de9437ea0eb94656678729b9e" exitCode=1 Apr 20 21:47:31.880736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.880713 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"4141e0d3d133f54c13143401c4fce44219101c4470f8735107bcf973dfe42753"} Apr 20 21:47:31.880785 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.880740 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"f077a1dfb9a31c87f4ac22d1f42fd2b2ce058f461383ec76708b358723f8b28a"} Apr 20 21:47:31.880785 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.880753 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"39154b37c43393f33aa9c76df17944429abcc595121abbe18f4fa999c7f745b9"} Apr 20 21:47:31.880785 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.880766 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" 
event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"bdf1179f3d797ec0d702da3e5a6ef1478a381d7d4926b68d686e5ec105857698"} Apr 20 21:47:31.880785 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.880778 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerDied","Data":"fb23555fcdef4e1347c17d38fd319f4c4b6ec11de9437ea0eb94656678729b9e"} Apr 20 21:47:31.880946 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.880792 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"f60c7e6e753c89a9636f9e461c117fdcb3313be38445410fe65baafb9d9e0718"} Apr 20 21:47:31.881876 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.881857 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" event={"ID":"a3b76653-340c-4572-9e21-939d7f3ef9ae","Type":"ContainerStarted","Data":"904d2f379b611c370d01519f59697ecde093868bf7a63a38cef62c042036cb85"} Apr 20 21:47:31.882920 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.882904 2566 generic.go:358] "Generic (PLEG): container finished" podID="aeca6dfa04aa73a6b44d98f09ffc3dbc" containerID="3a207e085b9c30b641ea0312bb4cff4dac5e2fd4fd7d56184fcda0ecde26c44d" exitCode=0 Apr 20 21:47:31.883009 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.882963 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal" event={"ID":"aeca6dfa04aa73a6b44d98f09ffc3dbc","Type":"ContainerDied","Data":"3a207e085b9c30b641ea0312bb4cff4dac5e2fd4fd7d56184fcda0ecde26c44d"} Apr 20 21:47:31.884556 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.884534 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal" event={"ID":"22aaf800e14f7250fca48df51b47e1cc","Type":"ContainerStarted","Data":"0b11fd9b5e5c099450f7b4d481b5bd386d4fa735b9dac45983531d7163204588"} Apr 20 21:47:31.885691 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.885666 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t72lj" event={"ID":"7201191d-577f-470c-81dc-ec7f86680c09","Type":"ContainerStarted","Data":"7c14ee668c1541074cbf7112dfec91c3441e31a9fb84d6f870b49d6a6c8bf23f"} Apr 20 21:47:31.886938 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.886915 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" event={"ID":"3738fa41-3050-4602-8e4b-1cf6cf9e7e42","Type":"ContainerStarted","Data":"3aa210ce11672a0dee0294356c270e1e990d3df54829b4d8c0f3f4c0df801064"} Apr 20 21:47:31.888096 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.888076 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-swrdw" event={"ID":"55307413-a629-4893-b816-dd674a0d602f","Type":"ContainerStarted","Data":"ea5bbcf6416b9efb1d18df4240595d1d4aaa9ad0df3bbe1da57e717fe59f9e04"} Apr 20 21:47:31.906879 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.906838 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jrqrk" podStartSLOduration=3.066523592 podStartE2EDuration="20.906826517s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.004297027 +0000 UTC m=+1.812984370" lastFinishedPulling="2026-04-20 21:47:30.844599948 +0000 UTC m=+19.653287295" observedRunningTime="2026-04-20 21:47:31.906731619 +0000 UTC m=+20.715418986" watchObservedRunningTime="2026-04-20 21:47:31.906826517 +0000 UTC m=+20.715513881" Apr 20 21:47:31.915824 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.915804 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:31.916574 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.916556 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:31.929936 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.929892 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t72lj" podStartSLOduration=3.023619462 podStartE2EDuration="20.9298784s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.012766965 +0000 UTC m=+1.821454308" lastFinishedPulling="2026-04-20 21:47:30.919025904 +0000 UTC m=+19.727713246" observedRunningTime="2026-04-20 21:47:31.929844735 +0000 UTC m=+20.738532102" watchObservedRunningTime="2026-04-20 21:47:31.9298784 +0000 UTC m=+20.738565766" Apr 20 21:47:31.940983 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.940903 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ljbjz" podStartSLOduration=3.063075631 podStartE2EDuration="20.940895786s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.027538679 +0000 UTC m=+1.836226022" lastFinishedPulling="2026-04-20 21:47:30.905358825 +0000 UTC m=+19.714046177" observedRunningTime="2026-04-20 21:47:31.940747674 +0000 UTC m=+20.749435052" watchObservedRunningTime="2026-04-20 21:47:31.940895786 +0000 UTC m=+20.749583155" Apr 20 21:47:31.952391 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.952359 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8b2mt" podStartSLOduration=3.10331961 podStartE2EDuration="20.952348086s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.036983323 +0000 UTC m=+1.845670678" lastFinishedPulling="2026-04-20 21:47:30.886011796 +0000 UTC 
m=+19.694699154" observedRunningTime="2026-04-20 21:47:31.952060889 +0000 UTC m=+20.760748255" watchObservedRunningTime="2026-04-20 21:47:31.952348086 +0000 UTC m=+20.761035450" Apr 20 21:47:31.964662 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.964626 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-swrdw" podStartSLOduration=2.892003282 podStartE2EDuration="20.964619268s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.04487564 +0000 UTC m=+1.853562982" lastFinishedPulling="2026-04-20 21:47:31.117491611 +0000 UTC m=+19.926178968" observedRunningTime="2026-04-20 21:47:31.964543981 +0000 UTC m=+20.773231345" watchObservedRunningTime="2026-04-20 21:47:31.964619268 +0000 UTC m=+20.773306632" Apr 20 21:47:31.976155 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:31.975833 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-199.ec2.internal" podStartSLOduration=20.975820244 podStartE2EDuration="20.975820244s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:47:31.975762781 +0000 UTC m=+20.784450186" watchObservedRunningTime="2026-04-20 21:47:31.975820244 +0000 UTC m=+20.784507608" Apr 20 21:47:32.662064 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.662020 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 21:47:32.725478 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.725372 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T21:47:32.66203859Z","UUID":"e46e57df-f40e-4e46-b7ac-ee9691908458","Handler":null,"Name":"","Endpoint":""} Apr 20 21:47:32.727041 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.727021 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 21:47:32.727140 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.727048 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 21:47:32.769425 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.769398 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:47:32.769547 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.769429 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs" Apr 20 21:47:32.769547 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:32.769528 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3" Apr 20 21:47:32.769654 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:32.769638 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92" Apr 20 21:47:32.891878 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.891800 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal" event={"ID":"aeca6dfa04aa73a6b44d98f09ffc3dbc","Type":"ContainerStarted","Data":"e6f1b4979de3e513ee1e2dfcb1ab6ea4dd6cd87f3e62701bc587333d35506b5e"} Apr 20 21:47:32.893355 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.893311 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t8r9c" event={"ID":"f2edab0e-3795-4fcc-9d28-b2979d98277c","Type":"ContainerStarted","Data":"c86dfd8a205b6aab0db88d9bff94f70f3a2b7e9013c4049d13d1eb58c6819734"} Apr 20 21:47:32.895247 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.895222 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" event={"ID":"a3b76653-340c-4572-9e21-939d7f3ef9ae","Type":"ContainerStarted","Data":"aca98a55191b20d453d274cb9bc9aab004007c28b4259e2eaf4ed488cec231a3"} Apr 20 21:47:32.895524 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.895500 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:32.896195 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.896178 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ljbjz" Apr 20 21:47:32.904473 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.904433 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-199.ec2.internal" podStartSLOduration=21.904419724 podStartE2EDuration="21.904419724s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:47:32.90409586 +0000 UTC m=+21.712783231" watchObservedRunningTime="2026-04-20 21:47:32.904419724 +0000 UTC m=+21.713107091" Apr 20 21:47:32.915807 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:32.915765 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t8r9c" podStartSLOduration=3.985102586 podStartE2EDuration="21.915749828s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:12.972911629 +0000 UTC m=+1.781598972" lastFinishedPulling="2026-04-20 21:47:30.90355887 +0000 UTC m=+19.712246214" observedRunningTime="2026-04-20 21:47:32.915614713 +0000 UTC m=+21.724302077" watchObservedRunningTime="2026-04-20 21:47:32.915749828 +0000 UTC m=+21.724437194" Apr 20 21:47:33.769813 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:33.769736 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:33.769989 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:33.769866 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232" Apr 20 21:47:33.899317 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:33.899270 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 21:47:33.899699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:33.899639 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"cbae2b0812fedd4ee49ec1c988e7a966363d998dc25525efda87c6af63737a46"} Apr 20 21:47:33.901665 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:33.901535 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" event={"ID":"a3b76653-340c-4572-9e21-939d7f3ef9ae","Type":"ContainerStarted","Data":"d72baa851a20b6b545d64520fc87305f88583e7e6eea586439455bb34822ce77"} Apr 20 21:47:33.919487 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:33.919438 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t5zf" podStartSLOduration=2.415903832 podStartE2EDuration="22.919427352s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:12.986394374 +0000 UTC m=+1.795081717" lastFinishedPulling="2026-04-20 21:47:33.48991789 +0000 UTC m=+22.298605237" observedRunningTime="2026-04-20 21:47:33.919024886 +0000 UTC m=+22.727712250" watchObservedRunningTime="2026-04-20 21:47:33.919427352 +0000 UTC m=+22.728114757" Apr 20 21:47:34.769429 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:34.769391 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:47:34.769602 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:34.769509 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3" Apr 20 21:47:34.769602 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:34.769568 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs" Apr 20 21:47:34.769717 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:34.769673 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92" Apr 20 21:47:35.769703 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.769673 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:35.770035 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:35.769778 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232" Apr 20 21:47:35.908590 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.908417 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 21:47:35.908922 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.908898 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"239e7a96804cbbc5d9a956e2a7281baca8a9d8f86834f43bb54147a1c9afa5f2"} Apr 20 21:47:35.909223 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.909202 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:35.909335 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.909310 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:35.909335 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.909335 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:35.909464 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.909350 2566 scope.go:117] "RemoveContainer" containerID="fb23555fcdef4e1347c17d38fd319f4c4b6ec11de9437ea0eb94656678729b9e" Apr 20 21:47:35.923322 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.923251 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:35.923401 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:35.923331 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:47:36.769717 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:47:36.769519 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs" Apr 20 21:47:36.770524 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:36.769571 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:47:36.770524 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:36.769809 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92" Apr 20 21:47:36.770524 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:36.769867 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3" Apr 20 21:47:36.912628 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:36.912586 2566 generic.go:358] "Generic (PLEG): container finished" podID="b6c011cb-eacd-4c0a-88d4-f902e63941c3" containerID="66a6843c5589724ef6ca826bebdc0965cda51f73d589643c6f0f7ac666899587" exitCode=0 Apr 20 21:47:36.912783 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:36.912669 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerDied","Data":"66a6843c5589724ef6ca826bebdc0965cda51f73d589643c6f0f7ac666899587"} Apr 20 21:47:36.915935 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:36.915911 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 21:47:36.916231 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:36.916215 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" event={"ID":"000763ee-232e-428c-84f2-4ca88f559d17","Type":"ContainerStarted","Data":"6e4c18566f6732979411f68205f024570ae2e09c58ee2ae0c407ac02a73bd520"} Apr 20 21:47:36.958054 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:36.958022 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" podStartSLOduration=8.052349117 podStartE2EDuration="25.958009695s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.02154275 +0000 UTC m=+1.830230092" lastFinishedPulling="2026-04-20 21:47:30.927203323 +0000 UTC m=+19.735890670" observedRunningTime="2026-04-20 21:47:36.956530605 +0000 UTC m=+25.765217970" watchObservedRunningTime="2026-04-20 21:47:36.958009695 +0000 UTC m=+25.766697041" Apr 20 21:47:37.769481 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:47:37.769452 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:37.769576 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:37.769546 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232" Apr 20 21:47:37.819090 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:37.819063 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pt7fs"] Apr 20 21:47:37.819520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:37.819157 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs" Apr 20 21:47:37.819520 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:37.819243 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:37.819856 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:37.819835 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-twt7t"]
Apr 20 21:47:37.821485 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:37.821463 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j8c9k"]
Apr 20 21:47:37.821566 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:37.821541 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:37.821634 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:37.821620 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:37.919780 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:37.919717 2566 generic.go:358] "Generic (PLEG): container finished" podID="b6c011cb-eacd-4c0a-88d4-f902e63941c3" containerID="1ab352ec5108c478e8e9fda37f640c9ca76b79bcdfbe4b088e5ce513db7f40b7" exitCode=0
Apr 20 21:47:37.919902 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:37.919796 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:37.919902 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:37.919812 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerDied","Data":"1ab352ec5108c478e8e9fda37f640c9ca76b79bcdfbe4b088e5ce513db7f40b7"}
Apr 20 21:47:37.920091 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:37.920064 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:38.923718 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:38.923680 2566 generic.go:358] "Generic (PLEG): container finished" podID="b6c011cb-eacd-4c0a-88d4-f902e63941c3" containerID="a2b33c2f6b5f9f8da0ce9a86b6f7d2550ff52ae4880c515cab73a64ccc85322e" exitCode=0
Apr 20 21:47:38.923718 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:38.923720 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerDied","Data":"a2b33c2f6b5f9f8da0ce9a86b6f7d2550ff52ae4880c515cab73a64ccc85322e"}
Apr 20 21:47:39.769935 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:39.769904 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:39.769935 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:39.769923 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:39.770099 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:39.769943 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:39.770099 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:39.770049 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:39.770209 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:39.770159 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:39.770471 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:39.770430 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:41.770426 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:41.770149 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:41.770871 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:41.770265 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:41.770871 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:41.770517 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:41.770871 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:41.770312 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:41.770871 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:41.770651 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:41.770871 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:41.770853 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:43.769895 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:43.769866 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:47:43.770411 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:43.769873 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:43.770411 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:43.769986 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-twt7t" podUID="d0ce3c41-e846-4b03-82b0-0fae9d903232"
Apr 20 21:47:43.770411 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:43.769873 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:43.770411 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:43.770062 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3"
Apr 20 21:47:43.770411 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:43.770143 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pt7fs" podUID="04d33160-cee6-4aaf-ab79-d806da372e92"
Apr 20 21:47:44.025213 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.025121 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-199.ec2.internal" event="NodeReady"
Apr 20 21:47:44.025383 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.025292 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 21:47:44.058088 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.058057 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7b5f674b89-f7wg6"]
Apr 20 21:47:44.090020 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.089986 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cdxfg"]
Apr 20 21:47:44.090184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.090160 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.093706 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.092726 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 21:47:44.093706 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.093686 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 21:47:44.093877 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.093725 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 21:47:44.094181 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.094143 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rxx7z\""
Apr 20 21:47:44.099596 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.099574 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 21:47:44.106040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.106021 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7nmnk"]
Apr 20 21:47:44.106218 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.106203 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.108573 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.108553 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 21:47:44.108683 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.108659 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 21:47:44.108749 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.108712 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-s4j5s\""
Apr 20 21:47:44.124346 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.124239 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b5f674b89-f7wg6"]
Apr 20 21:47:44.124346 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.124337 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdxfg"]
Apr 20 21:47:44.124525 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.124347 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7nmnk"
Apr 20 21:47:44.124525 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.124353 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7nmnk"]
Apr 20 21:47:44.126893 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.126875 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 21:47:44.126985 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.126880 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 21:47:44.127040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.126990 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 21:47:44.127177 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.127154 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bqmdj\""
Apr 20 21:47:44.213549 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213518 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.213686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213566 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-certificates\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.213686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213633 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-trusted-ca\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.213686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213671 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmcs7\" (UniqueName: \"kubernetes.io/projected/edf22b9c-7596-4c82-a080-dfbe98377c19-kube-api-access-pmcs7\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.213844 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213706 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-bound-sa-token\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.213844 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213751 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk"
Apr 20 21:47:44.213844 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213777 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-installation-pull-secrets\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.213844 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213805 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbfv\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-kube-api-access-zfbfv\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.214024 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213877 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88bnc\" (UniqueName: \"kubernetes.io/projected/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-kube-api-access-88bnc\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk"
Apr 20 21:47:44.214024 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213917 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edf22b9c-7596-4c82-a080-dfbe98377c19-tmp-dir\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.214024 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213939 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-image-registry-private-configuration\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.214024 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213956 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-ca-trust-extracted\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.214024 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.213997 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edf22b9c-7596-4c82-a080-dfbe98377c19-config-volume\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.214197 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.214034 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315206 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315130 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315206 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315176 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315210 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-certificates\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315236 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-trusted-ca\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315261 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmcs7\" (UniqueName: \"kubernetes.io/projected/edf22b9c-7596-4c82-a080-dfbe98377c19-kube-api-access-pmcs7\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.315273 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315306 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-bound-sa-token\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.315309 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b5f674b89-f7wg6: secret "image-registry-tls" not found
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.315368 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls podName:de9ba6bd-1329-40c2-b819-7dce1bdd20f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.815353577 +0000 UTC m=+33.624040933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls") pod "image-registry-7b5f674b89-f7wg6" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0") : secret "image-registry-tls" not found
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.315364 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.315399 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls podName:edf22b9c-7596-4c82-a080-dfbe98377c19 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.815393393 +0000 UTC m=+33.624080737 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls") pod "dns-default-cdxfg" (UID: "edf22b9c-7596-4c82-a080-dfbe98377c19") : secret "dns-default-metrics-tls" not found
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315435 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk"
Apr 20 21:47:44.315469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315472 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-installation-pull-secrets\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315992 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315506 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbfv\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-kube-api-access-zfbfv\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315992 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315550 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88bnc\" (UniqueName: \"kubernetes.io/projected/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-kube-api-access-88bnc\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk"
Apr 20 21:47:44.315992 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315613 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edf22b9c-7596-4c82-a080-dfbe98377c19-tmp-dir\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.315992 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315657 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-image-registry-private-configuration\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315992 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.315560 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:47:44.315992 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315686 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-ca-trust-extracted\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.315992 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.315722 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert podName:3c4537b6-b9be-4eb9-9b2e-2867dc27db2b nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.815703459 +0000 UTC m=+33.624390816 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert") pod "ingress-canary-7nmnk" (UID: "3c4537b6-b9be-4eb9-9b2e-2867dc27db2b") : secret "canary-serving-cert" not found
Apr 20 21:47:44.315992 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.315744 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edf22b9c-7596-4c82-a080-dfbe98377c19-config-volume\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.316425 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.316209 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-ca-trust-extracted\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.316425 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.316247 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edf22b9c-7596-4c82-a080-dfbe98377c19-tmp-dir\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.316425 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.316335 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-trusted-ca\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.316425 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.316365 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edf22b9c-7596-4c82-a080-dfbe98377c19-config-volume\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.316633 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.316614 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-certificates\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.319389 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.319368 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-installation-pull-secrets\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.319471 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.319372 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-image-registry-private-configuration\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.324645 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.324620 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-bound-sa-token\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.324645 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.324627 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmcs7\" (UniqueName: \"kubernetes.io/projected/edf22b9c-7596-4c82-a080-dfbe98377c19-kube-api-access-pmcs7\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.324777 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.324709 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbfv\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-kube-api-access-zfbfv\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.324969 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.324950 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88bnc\" (UniqueName: \"kubernetes.io/projected/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-kube-api-access-88bnc\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk"
Apr 20 21:47:44.416409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.416385 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:47:44.416523 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.416508 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:44.416576 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.416567 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:16.416554686 +0000 UTC m=+65.225242034 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:44.517607 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.517574 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:47:44.517733 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.517722 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:47:44.517769 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.517739 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:47:44.517769 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.517748 2566 projected.go:194] Error preparing data for projected volume kube-api-access-54x8q for pod openshift-network-diagnostics/network-check-target-pt7fs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:44.517833 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.517807 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q podName:04d33160-cee6-4aaf-ab79-d806da372e92 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:16.517792579 +0000 UTC m=+65.326479924 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-54x8q" (UniqueName: "kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q") pod "network-check-target-pt7fs" (UID: "04d33160-cee6-4aaf-ab79-d806da372e92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:44.820885 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.820858 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.820918 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.820962 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk"
Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.820985 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.821002 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b5f674b89-f7wg6: secret "image-registry-tls" not found
Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.821042 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls podName:de9ba6bd-1329-40c2-b819-7dce1bdd20f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:45.821028425 +0000 UTC m=+34.629715768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls") pod "image-registry-7b5f674b89-f7wg6" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0") : secret "image-registry-tls" not found
Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.821073 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.821077 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.821133 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert podName:3c4537b6-b9be-4eb9-9b2e-2867dc27db2b nodeName:}" failed. No retries permitted until 2026-04-20 21:47:45.821117032 +0000 UTC m=+34.629804391 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert") pod "ingress-canary-7nmnk" (UID: "3c4537b6-b9be-4eb9-9b2e-2867dc27db2b") : secret "canary-serving-cert" not found Apr 20 21:47:44.821515 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:44.821148 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls podName:edf22b9c-7596-4c82-a080-dfbe98377c19 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:45.821140561 +0000 UTC m=+34.629827904 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls") pod "dns-default-cdxfg" (UID: "edf22b9c-7596-4c82-a080-dfbe98377c19") : secret "dns-default-metrics-tls" not found Apr 20 21:47:44.937810 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:44.937785 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerStarted","Data":"72e3dde4f40bf82dbc59402b4175fa0db1c7d44e2000ecf97d77ff38f94e92c6"} Apr 20 21:47:45.425488 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.425448 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:45.425693 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.425587 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:45.425693 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.425647 2566 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret podName:d0ce3c41-e846-4b03-82b0-0fae9d903232 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:17.425632652 +0000 UTC m=+66.234319995 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret") pod "global-pull-secret-syncer-twt7t" (UID: "d0ce3c41-e846-4b03-82b0-0fae9d903232") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:45.770093 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.770059 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:47:45.770093 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.770075 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs" Apr 20 21:47:45.770341 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.770059 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t" Apr 20 21:47:45.772960 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.772929 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 21:47:45.773061 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.772968 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 21:47:45.774242 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.774223 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5wzfs\"" Apr 20 21:47:45.774402 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.774307 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 21:47:45.774402 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.774310 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 21:47:45.774402 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.774349 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hbjh6\"" Apr 20 21:47:45.828048 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.828029 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.828067 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg" Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.828099 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.828162 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.828177 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b5f674b89-f7wg6: secret "image-registry-tls" not found Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.828179 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.828190 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.828226 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls podName:de9ba6bd-1329-40c2-b819-7dce1bdd20f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:47.828212574 +0000 UTC m=+36.636899917 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls") pod "image-registry-7b5f674b89-f7wg6" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0") : secret "image-registry-tls" not found Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.828241 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert podName:3c4537b6-b9be-4eb9-9b2e-2867dc27db2b nodeName:}" failed. No retries permitted until 2026-04-20 21:47:47.828234265 +0000 UTC m=+36.636921607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert") pod "ingress-canary-7nmnk" (UID: "3c4537b6-b9be-4eb9-9b2e-2867dc27db2b") : secret "canary-serving-cert" not found Apr 20 21:47:45.828324 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:45.828251 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls podName:edf22b9c-7596-4c82-a080-dfbe98377c19 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:47.828246034 +0000 UTC m=+36.636933377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls") pod "dns-default-cdxfg" (UID: "edf22b9c-7596-4c82-a080-dfbe98377c19") : secret "dns-default-metrics-tls" not found Apr 20 21:47:45.941540 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.941515 2566 generic.go:358] "Generic (PLEG): container finished" podID="b6c011cb-eacd-4c0a-88d4-f902e63941c3" containerID="72e3dde4f40bf82dbc59402b4175fa0db1c7d44e2000ecf97d77ff38f94e92c6" exitCode=0 Apr 20 21:47:45.941664 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:45.941573 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerDied","Data":"72e3dde4f40bf82dbc59402b4175fa0db1c7d44e2000ecf97d77ff38f94e92c6"} Apr 20 21:47:46.948566 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:46.948532 2566 generic.go:358] "Generic (PLEG): container finished" podID="b6c011cb-eacd-4c0a-88d4-f902e63941c3" containerID="5154d79a42a5058499b235b93c15e537d291fc715d0777e0bbc9a678188d0fde" exitCode=0 Apr 20 21:47:46.949019 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:46.948592 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerDied","Data":"5154d79a42a5058499b235b93c15e537d291fc715d0777e0bbc9a678188d0fde"} Apr 20 21:47:47.844173 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:47.843987 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:47:47.844372 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:47:47.844202 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg" Apr 20 21:47:47.844372 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:47.844131 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:47:47.844372 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:47.844251 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b5f674b89-f7wg6: secret "image-registry-tls" not found Apr 20 21:47:47.844372 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:47.844324 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls podName:de9ba6bd-1329-40c2-b819-7dce1bdd20f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:51.844308296 +0000 UTC m=+40.652995639 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls") pod "image-registry-7b5f674b89-f7wg6" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0") : secret "image-registry-tls" not found Apr 20 21:47:47.844372 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:47.844336 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:47:47.844372 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:47.844237 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:47:47.844372 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:47.844336 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:47:47.844372 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:47.844376 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert podName:3c4537b6-b9be-4eb9-9b2e-2867dc27db2b nodeName:}" failed. No retries permitted until 2026-04-20 21:47:51.844365074 +0000 UTC m=+40.653052417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert") pod "ingress-canary-7nmnk" (UID: "3c4537b6-b9be-4eb9-9b2e-2867dc27db2b") : secret "canary-serving-cert" not found Apr 20 21:47:47.844660 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:47.844403 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls podName:edf22b9c-7596-4c82-a080-dfbe98377c19 nodeName:}" failed. 
No retries permitted until 2026-04-20 21:47:51.844388666 +0000 UTC m=+40.653076022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls") pod "dns-default-cdxfg" (UID: "edf22b9c-7596-4c82-a080-dfbe98377c19") : secret "dns-default-metrics-tls" not found Apr 20 21:47:47.953553 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:47.953519 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9k89" event={"ID":"b6c011cb-eacd-4c0a-88d4-f902e63941c3","Type":"ContainerStarted","Data":"34bb68ea19e539f7e4e65d311823628fd22d5b339986f7546d4a6aec00f9823c"} Apr 20 21:47:47.975166 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:47.975096 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-f9k89" podStartSLOduration=5.365077439 podStartE2EDuration="36.975084918s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.039941803 +0000 UTC m=+1.848629146" lastFinishedPulling="2026-04-20 21:47:44.649949278 +0000 UTC m=+33.458636625" observedRunningTime="2026-04-20 21:47:47.973400437 +0000 UTC m=+36.782087802" watchObservedRunningTime="2026-04-20 21:47:47.975084918 +0000 UTC m=+36.783772319" Apr 20 21:47:51.874246 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:51.874214 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:51.874271 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:51.874313 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg" Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:51.874392 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:51.874399 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:51.874432 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:51.874447 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b5f674b89-f7wg6: secret "image-registry-tls" not found Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:51.874439 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls podName:edf22b9c-7596-4c82-a080-dfbe98377c19 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:59.874426322 +0000 UTC m=+48.683113666 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls") pod "dns-default-cdxfg" (UID: "edf22b9c-7596-4c82-a080-dfbe98377c19") : secret "dns-default-metrics-tls" not found Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:51.874499 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert podName:3c4537b6-b9be-4eb9-9b2e-2867dc27db2b nodeName:}" failed. No retries permitted until 2026-04-20 21:47:59.874484405 +0000 UTC m=+48.683171753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert") pod "ingress-canary-7nmnk" (UID: "3c4537b6-b9be-4eb9-9b2e-2867dc27db2b") : secret "canary-serving-cert" not found Apr 20 21:47:51.874619 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:51.874516 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls podName:de9ba6bd-1329-40c2-b819-7dce1bdd20f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:59.874505408 +0000 UTC m=+48.683192755 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls") pod "image-registry-7b5f674b89-f7wg6" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0") : secret "image-registry-tls" not found Apr 20 21:47:59.930596 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:59.930555 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:59.930608 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg" Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:47:59.930646 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:59.930695 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:59.930712 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b5f674b89-f7wg6: secret "image-registry-tls" not found Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: E0420 
21:47:59.930744 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:59.930769 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls podName:de9ba6bd-1329-40c2-b819-7dce1bdd20f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:15.930752536 +0000 UTC m=+64.739439882 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls") pod "image-registry-7b5f674b89-f7wg6" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0") : secret "image-registry-tls" not found Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:59.930774 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:59.930792 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert podName:3c4537b6-b9be-4eb9-9b2e-2867dc27db2b nodeName:}" failed. No retries permitted until 2026-04-20 21:48:15.930775894 +0000 UTC m=+64.739463241 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert") pod "ingress-canary-7nmnk" (UID: "3c4537b6-b9be-4eb9-9b2e-2867dc27db2b") : secret "canary-serving-cert" not found Apr 20 21:47:59.931012 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:47:59.930828 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls podName:edf22b9c-7596-4c82-a080-dfbe98377c19 nodeName:}" failed. 
No retries permitted until 2026-04-20 21:48:15.930817412 +0000 UTC m=+64.739504765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls") pod "dns-default-cdxfg" (UID: "edf22b9c-7596-4c82-a080-dfbe98377c19") : secret "dns-default-metrics-tls" not found Apr 20 21:48:07.932909 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:07.932875 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvkzb" Apr 20 21:48:15.952113 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:15.952079 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:48:15.952468 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:15.952127 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg" Apr 20 21:48:15.952468 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:15.952155 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:48:15.952468 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:15.952213 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:48:15.952468 
ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:15.952232 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b5f674b89-f7wg6: secret "image-registry-tls" not found Apr 20 21:48:15.952468 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:15.952234 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:48:15.952468 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:15.952264 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:48:15.952468 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:15.952300 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert podName:3c4537b6-b9be-4eb9-9b2e-2867dc27db2b nodeName:}" failed. No retries permitted until 2026-04-20 21:48:47.952270383 +0000 UTC m=+96.760957731 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert") pod "ingress-canary-7nmnk" (UID: "3c4537b6-b9be-4eb9-9b2e-2867dc27db2b") : secret "canary-serving-cert" not found Apr 20 21:48:15.952468 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:15.952313 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls podName:de9ba6bd-1329-40c2-b819-7dce1bdd20f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:47.952307487 +0000 UTC m=+96.760994829 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls") pod "image-registry-7b5f674b89-f7wg6" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0") : secret "image-registry-tls" not found
Apr 20 21:48:15.952468 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:15.952331 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls podName:edf22b9c-7596-4c82-a080-dfbe98377c19 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:47.952317395 +0000 UTC m=+96.761004737 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls") pod "dns-default-cdxfg" (UID: "edf22b9c-7596-4c82-a080-dfbe98377c19") : secret "dns-default-metrics-tls" not found
Apr 20 21:48:16.456315 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.456270 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:48:16.458973 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.458954 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 21:48:16.467023 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:16.467006 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 21:48:16.467078 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:16.467068 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:20.467053022 +0000 UTC m=+129.275740364 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : secret "metrics-daemon-secret" not found
Apr 20 21:48:16.557125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.557097 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:48:16.560289 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.560257 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 21:48:16.569735 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.569714 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 21:48:16.581018 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.580997 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54x8q\" (UniqueName: \"kubernetes.io/projected/04d33160-cee6-4aaf-ab79-d806da372e92-kube-api-access-54x8q\") pod \"network-check-target-pt7fs\" (UID: \"04d33160-cee6-4aaf-ab79-d806da372e92\") " pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:48:16.692418 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.692393 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hbjh6\""
Apr 20 21:48:16.700541 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.700521 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:48:16.848926 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:16.848896 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pt7fs"]
Apr 20 21:48:16.851986 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:48:16.851952 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d33160_cee6_4aaf_ab79_d806da372e92.slice/crio-57db02fa88e6db42ee404ca35e7da035e42dcfcf6da0945df345e179add30f3a WatchSource:0}: Error finding container 57db02fa88e6db42ee404ca35e7da035e42dcfcf6da0945df345e179add30f3a: Status 404 returned error can't find the container with id 57db02fa88e6db42ee404ca35e7da035e42dcfcf6da0945df345e179add30f3a
Apr 20 21:48:17.006482 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:17.006406 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pt7fs" event={"ID":"04d33160-cee6-4aaf-ab79-d806da372e92","Type":"ContainerStarted","Data":"57db02fa88e6db42ee404ca35e7da035e42dcfcf6da0945df345e179add30f3a"}
Apr 20 21:48:17.465293 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:17.465234 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:48:17.468580 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:17.468556 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 21:48:17.478260 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:17.478226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0ce3c41-e846-4b03-82b0-0fae9d903232-original-pull-secret\") pod \"global-pull-secret-syncer-twt7t\" (UID: \"d0ce3c41-e846-4b03-82b0-0fae9d903232\") " pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:48:17.585448 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:17.585413 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-twt7t"
Apr 20 21:48:17.706126 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:17.706095 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-twt7t"]
Apr 20 21:48:17.709473 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:48:17.709441 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0ce3c41_e846_4b03_82b0_0fae9d903232.slice/crio-dce48d3105d9a1041be97c348840d5034333bb1bd462711a9df820c288c5bdbe WatchSource:0}: Error finding container dce48d3105d9a1041be97c348840d5034333bb1bd462711a9df820c288c5bdbe: Status 404 returned error can't find the container with id dce48d3105d9a1041be97c348840d5034333bb1bd462711a9df820c288c5bdbe
Apr 20 21:48:18.009342 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:18.009306 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-twt7t" event={"ID":"d0ce3c41-e846-4b03-82b0-0fae9d903232","Type":"ContainerStarted","Data":"dce48d3105d9a1041be97c348840d5034333bb1bd462711a9df820c288c5bdbe"}
Apr 20 21:48:20.014250 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:20.014212 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pt7fs" event={"ID":"04d33160-cee6-4aaf-ab79-d806da372e92","Type":"ContainerStarted","Data":"bae59149ae729891ffaf1683f2c30dbe29645d885215593a89e61bfff09285bd"}
Apr 20 21:48:20.014746 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:20.014326 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:48:20.028864 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:20.028817 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pt7fs" podStartSLOduration=66.186501479 podStartE2EDuration="1m9.02880046s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:48:16.853770632 +0000 UTC m=+65.662457983" lastFinishedPulling="2026-04-20 21:48:19.696069606 +0000 UTC m=+68.504756964" observedRunningTime="2026-04-20 21:48:20.02842938 +0000 UTC m=+68.837116746" watchObservedRunningTime="2026-04-20 21:48:20.02880046 +0000 UTC m=+68.837487826"
Apr 20 21:48:22.019043 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:22.019004 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-twt7t" event={"ID":"d0ce3c41-e846-4b03-82b0-0fae9d903232","Type":"ContainerStarted","Data":"84e771ab7007f8b9f8307e45809cfa72f6396ece99d59cb49f8576c4d44236a6"}
Apr 20 21:48:22.032953 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:22.032909 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-twt7t" podStartSLOduration=65.393657063 podStartE2EDuration="1m9.032894961s" podCreationTimestamp="2026-04-20 21:47:13 +0000 UTC" firstStartedPulling="2026-04-20 21:48:17.711813426 +0000 UTC m=+66.520500769" lastFinishedPulling="2026-04-20 21:48:21.351051319 +0000 UTC m=+70.159738667" observedRunningTime="2026-04-20 21:48:22.032134032 +0000 UTC m=+70.840821397" watchObservedRunningTime="2026-04-20 21:48:22.032894961 +0000 UTC m=+70.841582325"
Apr 20 21:48:47.979539 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:47.979495 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:48:47.979539 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:47.979546 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:48:47.980070 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:47.979639 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:48:47.980070 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:47.979657 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b5f674b89-f7wg6: secret "image-registry-tls" not found
Apr 20 21:48:47.980070 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:47.979683 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:48:47.980070 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:47.979716 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls podName:de9ba6bd-1329-40c2-b819-7dce1bdd20f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:51.979699057 +0000 UTC m=+160.788386399 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls") pod "image-registry-7b5f674b89-f7wg6" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0") : secret "image-registry-tls" not found
Apr 20 21:48:47.980070 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:47.979729 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls podName:edf22b9c-7596-4c82-a080-dfbe98377c19 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:51.979723536 +0000 UTC m=+160.788410879 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls") pod "dns-default-cdxfg" (UID: "edf22b9c-7596-4c82-a080-dfbe98377c19") : secret "dns-default-metrics-tls" not found
Apr 20 21:48:47.980070 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:47.979764 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk"
Apr 20 21:48:47.980070 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:47.979835 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:48:47.980070 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:48:47.979857 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert podName:3c4537b6-b9be-4eb9-9b2e-2867dc27db2b nodeName:}" failed. No retries permitted until 2026-04-20 21:49:51.979849376 +0000 UTC m=+160.788536718 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert") pod "ingress-canary-7nmnk" (UID: "3c4537b6-b9be-4eb9-9b2e-2867dc27db2b") : secret "canary-serving-cert" not found
Apr 20 21:48:51.018423 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:48:51.018386 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pt7fs"
Apr 20 21:49:20.502491 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:20.502445 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:49:20.503039 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:20.502618 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 21:49:20.503039 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:20.502707 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs podName:86a4e942-ea9e-4978-b92e-c96688b972a3 nodeName:}" failed. No retries permitted until 2026-04-20 21:51:22.502683104 +0000 UTC m=+251.311370453 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs") pod "network-metrics-daemon-j8c9k" (UID: "86a4e942-ea9e-4978-b92e-c96688b972a3") : secret "metrics-daemon-secret" not found
Apr 20 21:49:31.168635 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.168604 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5b8b45bf6b-bjzvc"]
Apr 20 21:49:31.171703 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.171682 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.174209 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.174191 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 20 21:49:31.174322 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.174208 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 21:49:31.174498 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.174484 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bklt2\""
Apr 20 21:49:31.174665 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.174644 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 20 21:49:31.175502 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.175488 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 20 21:49:31.175578 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.175508 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 20 21:49:31.175578 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.175528 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 21:49:31.184118 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.184098 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b8b45bf6b-bjzvc"]
Apr 20 21:49:31.275325 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.275296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-stats-auth\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.275325 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.275324 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.275499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.275368 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.275499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.275406 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-default-certificate\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.275499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.275426 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcwv\" (UniqueName: \"kubernetes.io/projected/dc92671e-7696-4acb-8e57-f2e271dec9f2-kube-api-access-4qcwv\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.375955 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.375911 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.375955 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.375962 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-default-certificate\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.376179 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.375991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcwv\" (UniqueName: \"kubernetes.io/projected/dc92671e-7696-4acb-8e57-f2e271dec9f2-kube-api-access-4qcwv\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.376179 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.376052 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-stats-auth\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.376179 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.376076 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.376179 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:31.376107 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:31.876081631 +0000 UTC m=+140.684768991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : configmap references non-existent config key: service-ca.crt
Apr 20 21:49:31.376179 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:31.376154 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 21:49:31.376410 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:31.376223 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:31.876205578 +0000 UTC m=+140.684892923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : secret "router-metrics-certs-default" not found
Apr 20 21:49:31.378391 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.378372 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-default-certificate\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.378452 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.378404 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-stats-auth\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.384637 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.384610 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcwv\" (UniqueName: \"kubernetes.io/projected/dc92671e-7696-4acb-8e57-f2e271dec9f2-kube-api-access-4qcwv\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.880412 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.880373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.880580 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:31.880469 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:31.880580 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:31.880565 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:32.880547765 +0000 UTC m=+141.689235112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : configmap references non-existent config key: service-ca.crt
Apr 20 21:49:31.880659 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:31.880614 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 21:49:31.880691 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:31.880682 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:32.880669124 +0000 UTC m=+141.689356467 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : secret "router-metrics-certs-default" not found
Apr 20 21:49:32.887529 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:32.887496 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:32.887939 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:32.887577 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:32.887939 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:32.887671 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:34.887657757 +0000 UTC m=+143.696345099 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : configmap references non-existent config key: service-ca.crt
Apr 20 21:49:32.887939 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:32.887712 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 21:49:32.887939 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:32.887822 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:34.887803005 +0000 UTC m=+143.696490351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : secret "router-metrics-certs-default" not found
Apr 20 21:49:34.901416 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:34.901388 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:34.901808 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:34.901456 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:49:34.901808 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:34.901550 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:38.901532892 +0000 UTC m=+147.710220247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : configmap references non-existent config key: service-ca.crt
Apr 20 21:49:34.901808 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:34.901571 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 21:49:34.901808 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:34.901606 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:38.901595028 +0000 UTC m=+147.710282382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : secret "router-metrics-certs-default" not found
Apr 20 21:49:36.981965 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:36.981934 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm"]
Apr 20 21:49:36.984871 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:36.984855 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm"
Apr 20 21:49:36.987293 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:36.987264 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 20 21:49:36.987404 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:36.987265 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 20 21:49:36.988453 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:36.988438 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-ktkpm\""
Apr 20 21:49:36.992121 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:36.992097 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm"]
Apr 20 21:49:37.119560 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:37.119516 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8q89\" (UniqueName: \"kubernetes.io/projected/d17e1fe4-c7f8-4b47-ae22-d0107c62522f-kube-api-access-n8q89\") pod \"migrator-74bb7799d9-b8cgm\" (UID: \"d17e1fe4-c7f8-4b47-ae22-d0107c62522f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm"
Apr 20 21:49:37.220495 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:37.220461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8q89\" (UniqueName: \"kubernetes.io/projected/d17e1fe4-c7f8-4b47-ae22-d0107c62522f-kube-api-access-n8q89\") pod \"migrator-74bb7799d9-b8cgm\" (UID: \"d17e1fe4-c7f8-4b47-ae22-d0107c62522f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm"
Apr 20 21:49:37.228286 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:37.228266 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8q89\" (UniqueName: \"kubernetes.io/projected/d17e1fe4-c7f8-4b47-ae22-d0107c62522f-kube-api-access-n8q89\") pod \"migrator-74bb7799d9-b8cgm\" (UID: \"d17e1fe4-c7f8-4b47-ae22-d0107c62522f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm"
Apr 20 21:49:37.294431 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:37.294363 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm"
Apr 20 21:49:37.405099 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:37.405063 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm"]
Apr 20 21:49:37.407843 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:49:37.407821 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd17e1fe4_c7f8_4b47_ae22_d0107c62522f.slice/crio-6bb9b57428cd2629039200d8a9f7191def23f7d3cb8a1952488dd8976403c602 WatchSource:0}: Error finding container 6bb9b57428cd2629039200d8a9f7191def23f7d3cb8a1952488dd8976403c602: Status 404 returned error can't find the container with id 6bb9b57428cd2629039200d8a9f7191def23f7d3cb8a1952488dd8976403c602
Apr 20 21:49:38.158647 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:38.158614 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm" event={"ID":"d17e1fe4-c7f8-4b47-ae22-d0107c62522f","Type":"ContainerStarted","Data":"6bb9b57428cd2629039200d8a9f7191def23f7d3cb8a1952488dd8976403c602"}
Apr 20 21:49:38.533040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:38.533014 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t72lj_7201191d-577f-470c-81dc-ec7f86680c09/dns-node-resolver/0.log"
Apr 20 21:49:38.934173 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:38.934092 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc" Apr 20 21:49:38.934341 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:38.934174 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc" Apr 20 21:49:38.934341 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:38.934227 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 21:49:38.934341 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:38.934306 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:46.93427469 +0000 UTC m=+155.742962033 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : secret "router-metrics-certs-default" not found Apr 20 21:49:38.934341 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:38.934322 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:46.934316003 +0000 UTC m=+155.743003345 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : configmap references non-existent config key: service-ca.crt Apr 20 21:49:39.133043 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:39.133016 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8b2mt_ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468/node-ca/0.log" Apr 20 21:49:39.161982 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:39.161952 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm" event={"ID":"d17e1fe4-c7f8-4b47-ae22-d0107c62522f","Type":"ContainerStarted","Data":"d5c2f4a7db612d85c3c6455032c4949ab33995450f2b17f4538fc6303e1c0e7f"} Apr 20 21:49:39.161982 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:39.161983 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm" event={"ID":"d17e1fe4-c7f8-4b47-ae22-d0107c62522f","Type":"ContainerStarted","Data":"745373a4afce0f553ee3e2b8439c967c4ef37cd5902f6068e34208a539ae7549"} Apr 20 21:49:39.177663 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:49:39.177620 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b8cgm" podStartSLOduration=1.9410606910000001 podStartE2EDuration="3.177607168s" podCreationTimestamp="2026-04-20 21:49:36 +0000 UTC" firstStartedPulling="2026-04-20 21:49:37.409549361 +0000 UTC m=+146.218236704" lastFinishedPulling="2026-04-20 21:49:38.646095833 +0000 UTC m=+147.454783181" observedRunningTime="2026-04-20 21:49:39.176889504 +0000 UTC m=+147.985576868" watchObservedRunningTime="2026-04-20 21:49:39.177607168 +0000 UTC m=+147.986294576" Apr 20 21:49:46.993215 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:46.993173 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc" Apr 20 21:49:46.993698 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:46.993247 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc" Apr 20 21:49:46.993698 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:46.993381 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle podName:dc92671e-7696-4acb-8e57-f2e271dec9f2 nodeName:}" failed. No retries permitted until 2026-04-20 21:50:02.993363566 +0000 UTC m=+171.802050909 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle") pod "router-default-5b8b45bf6b-bjzvc" (UID: "dc92671e-7696-4acb-8e57-f2e271dec9f2") : configmap references non-existent config key: service-ca.crt Apr 20 21:49:46.995596 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:46.995571 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc92671e-7696-4acb-8e57-f2e271dec9f2-metrics-certs\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc" Apr 20 21:49:47.104149 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:47.104097 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" podUID="de9ba6bd-1329-40c2-b819-7dce1bdd20f0" Apr 20 21:49:47.116258 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:47.116229 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cdxfg" podUID="edf22b9c-7596-4c82-a080-dfbe98377c19" Apr 20 21:49:47.134607 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:47.134570 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7nmnk" podUID="3c4537b6-b9be-4eb9-9b2e-2867dc27db2b" Apr 20 21:49:47.177014 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:47.176988 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:49:47.177151 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:47.176988 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:49:47.177222 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:47.176998 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cdxfg" Apr 20 21:49:48.779965 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:49:48.779925 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-j8c9k" podUID="86a4e942-ea9e-4978-b92e-c96688b972a3" Apr 20 21:49:52.031158 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.031128 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:49:52.031542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.031167 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg" Apr 20 21:49:52.031542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.031212 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " 
pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:49:52.033608 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.033586 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"image-registry-7b5f674b89-f7wg6\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") " pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:49:52.033687 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.033589 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edf22b9c-7596-4c82-a080-dfbe98377c19-metrics-tls\") pod \"dns-default-cdxfg\" (UID: \"edf22b9c-7596-4c82-a080-dfbe98377c19\") " pod="openshift-dns/dns-default-cdxfg" Apr 20 21:49:52.033687 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.033637 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c4537b6-b9be-4eb9-9b2e-2867dc27db2b-cert\") pod \"ingress-canary-7nmnk\" (UID: \"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b\") " pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:49:52.281348 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.281244 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-s4j5s\"" Apr 20 21:49:52.281348 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.281244 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rxx7z\"" Apr 20 21:49:52.281348 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.281303 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bqmdj\"" Apr 20 21:49:52.288568 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.288540 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cdxfg" Apr 20 21:49:52.288568 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.288557 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:49:52.288759 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.288556 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7nmnk" Apr 20 21:49:52.419941 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.419913 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b5f674b89-f7wg6"] Apr 20 21:49:52.423537 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:49:52.423493 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde9ba6bd_1329_40c2_b819_7dce1bdd20f0.slice/crio-68936f7021935f1ebb8e6995eed81aecb0af54db35c0d029ae2a4bad23d36087 WatchSource:0}: Error finding container 68936f7021935f1ebb8e6995eed81aecb0af54db35c0d029ae2a4bad23d36087: Status 404 returned error can't find the container with id 68936f7021935f1ebb8e6995eed81aecb0af54db35c0d029ae2a4bad23d36087 Apr 20 21:49:52.639543 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.639467 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7nmnk"] Apr 20 21:49:52.643248 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:49:52.643212 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c4537b6_b9be_4eb9_9b2e_2867dc27db2b.slice/crio-41bc577de02612300f05da8b80585f57e662c452cc02292cc1359f14f0d40201 WatchSource:0}: Error finding container 41bc577de02612300f05da8b80585f57e662c452cc02292cc1359f14f0d40201: Status 404 returned error can't find the container with id 
41bc577de02612300f05da8b80585f57e662c452cc02292cc1359f14f0d40201 Apr 20 21:49:52.645157 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:52.645137 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdxfg"] Apr 20 21:49:52.647561 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:49:52.647541 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedf22b9c_7596_4c82_a080_dfbe98377c19.slice/crio-c3443d8290ef62cf3c56e6018d3dd09997134fa57868e2401c36cf4707f306d0 WatchSource:0}: Error finding container c3443d8290ef62cf3c56e6018d3dd09997134fa57868e2401c36cf4707f306d0: Status 404 returned error can't find the container with id c3443d8290ef62cf3c56e6018d3dd09997134fa57868e2401c36cf4707f306d0 Apr 20 21:49:53.190947 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:53.190909 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" event={"ID":"de9ba6bd-1329-40c2-b819-7dce1bdd20f0","Type":"ContainerStarted","Data":"db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a"} Apr 20 21:49:53.190947 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:53.190953 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" event={"ID":"de9ba6bd-1329-40c2-b819-7dce1bdd20f0","Type":"ContainerStarted","Data":"68936f7021935f1ebb8e6995eed81aecb0af54db35c0d029ae2a4bad23d36087"} Apr 20 21:49:53.191543 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:53.191033 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" Apr 20 21:49:53.192713 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:53.192673 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdxfg" 
event={"ID":"edf22b9c-7596-4c82-a080-dfbe98377c19","Type":"ContainerStarted","Data":"c3443d8290ef62cf3c56e6018d3dd09997134fa57868e2401c36cf4707f306d0"} Apr 20 21:49:53.193819 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:53.193796 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7nmnk" event={"ID":"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b","Type":"ContainerStarted","Data":"41bc577de02612300f05da8b80585f57e662c452cc02292cc1359f14f0d40201"} Apr 20 21:49:53.212810 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:53.212762 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" podStartSLOduration=156.212749916 podStartE2EDuration="2m36.212749916s" podCreationTimestamp="2026-04-20 21:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:49:53.210781395 +0000 UTC m=+162.019468759" watchObservedRunningTime="2026-04-20 21:49:53.212749916 +0000 UTC m=+162.021437280" Apr 20 21:49:55.199863 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:55.199819 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdxfg" event={"ID":"edf22b9c-7596-4c82-a080-dfbe98377c19","Type":"ContainerStarted","Data":"e94e81289c10325ca0a34d62176fec1e40b7094741ce0051fe9946a90ec0742d"} Apr 20 21:49:55.199863 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:55.199858 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdxfg" event={"ID":"edf22b9c-7596-4c82-a080-dfbe98377c19","Type":"ContainerStarted","Data":"bba5c893de0b8c91f37c1f3c655726ee273b51878b96abf04fbf4c1d3a76b12c"} Apr 20 21:49:55.200409 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:55.199995 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cdxfg" Apr 20 21:49:55.201164 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:49:55.201138 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7nmnk" event={"ID":"3c4537b6-b9be-4eb9-9b2e-2867dc27db2b","Type":"ContainerStarted","Data":"8bff4407f353b8846560250c612c181ab0577ca98bb13c1aab65722264f0f9e3"} Apr 20 21:49:55.216338 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:55.216297 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cdxfg" podStartSLOduration=129.477231579 podStartE2EDuration="2m11.216270464s" podCreationTimestamp="2026-04-20 21:47:44 +0000 UTC" firstStartedPulling="2026-04-20 21:49:52.649315155 +0000 UTC m=+161.458002503" lastFinishedPulling="2026-04-20 21:49:54.388354032 +0000 UTC m=+163.197041388" observedRunningTime="2026-04-20 21:49:55.215080704 +0000 UTC m=+164.023768069" watchObservedRunningTime="2026-04-20 21:49:55.216270464 +0000 UTC m=+164.024957828" Apr 20 21:49:55.228757 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:55.228719 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7nmnk" podStartSLOduration=129.483543623 podStartE2EDuration="2m11.22870681s" podCreationTimestamp="2026-04-20 21:47:44 +0000 UTC" firstStartedPulling="2026-04-20 21:49:52.645652749 +0000 UTC m=+161.454340092" lastFinishedPulling="2026-04-20 21:49:54.390815936 +0000 UTC m=+163.199503279" observedRunningTime="2026-04-20 21:49:55.227777029 +0000 UTC m=+164.036464394" watchObservedRunningTime="2026-04-20 21:49:55.22870681 +0000 UTC m=+164.037394198" Apr 20 21:49:59.753767 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.753732 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-knc69"] Apr 20 21:49:59.756602 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.756587 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.760373 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.760350 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 21:49:59.760750 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.760731 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 21:49:59.760860 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.760752 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 21:49:59.760923 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.760899 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 21:49:59.761112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.761095 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-txw8c\"" Apr 20 21:49:59.774265 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.774245 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-knc69"] Apr 20 21:49:59.780050 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.780028 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b5f674b89-f7wg6"] Apr 20 21:49:59.783247 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.783225 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-wz52s"] Apr 20 21:49:59.786027 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.786010 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wz52s" Apr 20 21:49:59.787292 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.787263 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/06adda10-864e-4b54-b6d8-4020aa460197-crio-socket\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.787377 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.787347 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/06adda10-864e-4b54-b6d8-4020aa460197-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.787424 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.787387 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/06adda10-864e-4b54-b6d8-4020aa460197-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.787424 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.787409 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/06adda10-864e-4b54-b6d8-4020aa460197-data-volume\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.787499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.787425 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6lrw\" (UniqueName: \"kubernetes.io/projected/06adda10-864e-4b54-b6d8-4020aa460197-kube-api-access-j6lrw\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.789105 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.788556 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-zd9tn\"" Apr 20 21:49:59.789105 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.788849 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 21:49:59.789257 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.789223 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 21:49:59.795574 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.795553 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wz52s"] Apr 20 21:49:59.888509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.888475 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/06adda10-864e-4b54-b6d8-4020aa460197-data-volume\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.888509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.888506 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6lrw\" (UniqueName: \"kubernetes.io/projected/06adda10-864e-4b54-b6d8-4020aa460197-kube-api-access-j6lrw\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " 
pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.888760 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.888531 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcpsk\" (UniqueName: \"kubernetes.io/projected/c6a9fd05-b4c9-4635-889f-49259cf8782a-kube-api-access-rcpsk\") pod \"downloads-6bcc868b7-wz52s\" (UID: \"c6a9fd05-b4c9-4635-889f-49259cf8782a\") " pod="openshift-console/downloads-6bcc868b7-wz52s" Apr 20 21:49:59.888760 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.888617 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/06adda10-864e-4b54-b6d8-4020aa460197-crio-socket\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.888760 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.888639 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/06adda10-864e-4b54-b6d8-4020aa460197-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.888760 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.888668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/06adda10-864e-4b54-b6d8-4020aa460197-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.888982 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.888752 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/06adda10-864e-4b54-b6d8-4020aa460197-crio-socket\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.888982 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.888836 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/06adda10-864e-4b54-b6d8-4020aa460197-data-volume\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.889150 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.889128 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/06adda10-864e-4b54-b6d8-4020aa460197-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.891176 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.891157 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/06adda10-864e-4b54-b6d8-4020aa460197-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.902975 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.902953 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6lrw\" (UniqueName: \"kubernetes.io/projected/06adda10-864e-4b54-b6d8-4020aa460197-kube-api-access-j6lrw\") pod \"insights-runtime-extractor-knc69\" (UID: \"06adda10-864e-4b54-b6d8-4020aa460197\") " pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:49:59.989256 ip-10-0-137-199 kubenswrapper[2566]: 
I0420 21:49:59.989216 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcpsk\" (UniqueName: \"kubernetes.io/projected/c6a9fd05-b4c9-4635-889f-49259cf8782a-kube-api-access-rcpsk\") pod \"downloads-6bcc868b7-wz52s\" (UID: \"c6a9fd05-b4c9-4635-889f-49259cf8782a\") " pod="openshift-console/downloads-6bcc868b7-wz52s" Apr 20 21:49:59.997361 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:49:59.997334 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcpsk\" (UniqueName: \"kubernetes.io/projected/c6a9fd05-b4c9-4635-889f-49259cf8782a-kube-api-access-rcpsk\") pod \"downloads-6bcc868b7-wz52s\" (UID: \"c6a9fd05-b4c9-4635-889f-49259cf8782a\") " pod="openshift-console/downloads-6bcc868b7-wz52s" Apr 20 21:50:00.065488 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:00.065392 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-knc69" Apr 20 21:50:00.096818 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:00.096766 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wz52s"
Apr 20 21:50:00.193395 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:00.193368 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-knc69"]
Apr 20 21:50:00.195654 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:00.195630 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06adda10_864e_4b54_b6d8_4020aa460197.slice/crio-e436355f3b62007aa4050290ef7674d222a46e56fe4c52cfb0a3d4c539f4b17a WatchSource:0}: Error finding container e436355f3b62007aa4050290ef7674d222a46e56fe4c52cfb0a3d4c539f4b17a: Status 404 returned error can't find the container with id e436355f3b62007aa4050290ef7674d222a46e56fe4c52cfb0a3d4c539f4b17a
Apr 20 21:50:00.214361 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:00.214323 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-knc69" event={"ID":"06adda10-864e-4b54-b6d8-4020aa460197","Type":"ContainerStarted","Data":"e436355f3b62007aa4050290ef7674d222a46e56fe4c52cfb0a3d4c539f4b17a"}
Apr 20 21:50:00.232721 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:00.232696 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wz52s"]
Apr 20 21:50:00.235475 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:00.235447 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6a9fd05_b4c9_4635_889f_49259cf8782a.slice/crio-b128cf240d26fa27913f93af51ea38af3e368dcf2dcb7445353c55a3e9dba054 WatchSource:0}: Error finding container b128cf240d26fa27913f93af51ea38af3e368dcf2dcb7445353c55a3e9dba054: Status 404 returned error can't find the container with id b128cf240d26fa27913f93af51ea38af3e368dcf2dcb7445353c55a3e9dba054
Apr 20 21:50:00.769326 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:00.769297 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k"
Apr 20 21:50:01.218551 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:01.218520 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-knc69" event={"ID":"06adda10-864e-4b54-b6d8-4020aa460197","Type":"ContainerStarted","Data":"e5b4eedc8171489221def0e3bce2f7948c515c5d2815689bb0d9cc11b11b6f10"}
Apr 20 21:50:01.218713 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:01.218556 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-knc69" event={"ID":"06adda10-864e-4b54-b6d8-4020aa460197","Type":"ContainerStarted","Data":"363128ec5c255dfcbf567ef99a019acb363f58c46ec2ab0886d3f93d10e6d6bf"}
Apr 20 21:50:01.219419 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:01.219397 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wz52s" event={"ID":"c6a9fd05-b4c9-4635-889f-49259cf8782a","Type":"ContainerStarted","Data":"b128cf240d26fa27913f93af51ea38af3e368dcf2dcb7445353c55a3e9dba054"}
Apr 20 21:50:03.009097 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:03.009061 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:50:03.009731 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:03.009708 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc92671e-7696-4acb-8e57-f2e271dec9f2-service-ca-bundle\") pod \"router-default-5b8b45bf6b-bjzvc\" (UID: \"dc92671e-7696-4acb-8e57-f2e271dec9f2\") " pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:50:03.227471 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:03.227432 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-knc69" event={"ID":"06adda10-864e-4b54-b6d8-4020aa460197","Type":"ContainerStarted","Data":"f496193e47c81fcc26dc0ea07ec92303a7db6da61cf1484d72e0821aefe14812"}
Apr 20 21:50:03.244290 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:03.244214 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-knc69" podStartSLOduration=2.273713447 podStartE2EDuration="4.244196944s" podCreationTimestamp="2026-04-20 21:49:59 +0000 UTC" firstStartedPulling="2026-04-20 21:50:00.248203223 +0000 UTC m=+169.056890571" lastFinishedPulling="2026-04-20 21:50:02.218686711 +0000 UTC m=+171.027374068" observedRunningTime="2026-04-20 21:50:03.24319337 +0000 UTC m=+172.051880734" watchObservedRunningTime="2026-04-20 21:50:03.244196944 +0000 UTC m=+172.052884310"
Apr 20 21:50:03.281422 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:03.281348 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:50:03.407663 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:03.407628 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b8b45bf6b-bjzvc"]
Apr 20 21:50:03.411297 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:03.411237 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc92671e_7696_4acb_8e57_f2e271dec9f2.slice/crio-83d4624492e134698b989cfc754fd4af65a5410af73804bea92afcb1ccbdc194 WatchSource:0}: Error finding container 83d4624492e134698b989cfc754fd4af65a5410af73804bea92afcb1ccbdc194: Status 404 returned error can't find the container with id 83d4624492e134698b989cfc754fd4af65a5410af73804bea92afcb1ccbdc194
Apr 20 21:50:04.231317 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:04.231258 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc" event={"ID":"dc92671e-7696-4acb-8e57-f2e271dec9f2","Type":"ContainerStarted","Data":"865409dbad245c1a4624f9b068f0f28e15275fd6d6aa168b16c5e31d37fb0ce7"}
Apr 20 21:50:04.231317 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:04.231319 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc" event={"ID":"dc92671e-7696-4acb-8e57-f2e271dec9f2","Type":"ContainerStarted","Data":"83d4624492e134698b989cfc754fd4af65a5410af73804bea92afcb1ccbdc194"}
Apr 20 21:50:04.250948 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:04.250888 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc" podStartSLOduration=33.2508722 podStartE2EDuration="33.2508722s" podCreationTimestamp="2026-04-20 21:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:50:04.248848927 +0000 UTC m=+173.057536304" watchObservedRunningTime="2026-04-20 21:50:04.2508722 +0000 UTC m=+173.059559564"
Apr 20 21:50:04.281649 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:04.281617 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:50:04.284315 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:04.284276 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:50:05.206258 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.206227 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cdxfg"
Apr 20 21:50:05.233961 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.233922 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:50:05.235212 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.235184 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5b8b45bf6b-bjzvc"
Apr 20 21:50:05.659684 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.659653 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66854c8fcc-xws46"]
Apr 20 21:50:05.662895 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.662875 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.665572 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.665549 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 21:50:05.665572 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.665567 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 21:50:05.666711 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.666661 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 21:50:05.666711 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.666703 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sdnpc\""
Apr 20 21:50:05.666866 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.666726 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 21:50:05.666866 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.666748 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 21:50:05.672510 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.672473 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66854c8fcc-xws46"]
Apr 20 21:50:05.734598 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.734574 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlh2\" (UniqueName: \"kubernetes.io/projected/7bef215f-5a33-4576-950c-039084f6b70e-kube-api-access-qrlh2\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.734759 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.734671 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-console-config\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.734759 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.734749 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-oauth-config\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.734883 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.734791 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-service-ca\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.734883 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.734812 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-serving-cert\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.734883 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.734869 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-oauth-serving-cert\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.835365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.835329 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlh2\" (UniqueName: \"kubernetes.io/projected/7bef215f-5a33-4576-950c-039084f6b70e-kube-api-access-qrlh2\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.835535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.835388 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-console-config\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.835535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.835435 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-oauth-config\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.835535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.835477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-service-ca\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.835535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.835503 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-serving-cert\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.835535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.835532 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-oauth-serving-cert\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.836316 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.836276 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-console-config\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.836979 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.836953 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-oauth-serving-cert\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.836979 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.836978 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-service-ca\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.838083 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.838060 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-oauth-config\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.838466 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.838440 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-serving-cert\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.846816 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.846791 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlh2\" (UniqueName: \"kubernetes.io/projected/7bef215f-5a33-4576-950c-039084f6b70e-kube-api-access-qrlh2\") pod \"console-66854c8fcc-xws46\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:05.973791 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:05.973758 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66854c8fcc-xws46"
Apr 20 21:50:06.099189 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:06.099158 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66854c8fcc-xws46"]
Apr 20 21:50:06.102518 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:06.102488 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bef215f_5a33_4576_950c_039084f6b70e.slice/crio-4316b010387dec3cece185a5700923ec970d6249681963ab9756c550549f96be WatchSource:0}: Error finding container 4316b010387dec3cece185a5700923ec970d6249681963ab9756c550549f96be: Status 404 returned error can't find the container with id 4316b010387dec3cece185a5700923ec970d6249681963ab9756c550549f96be
Apr 20 21:50:06.237404 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:06.237321 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66854c8fcc-xws46" event={"ID":"7bef215f-5a33-4576-950c-039084f6b70e","Type":"ContainerStarted","Data":"4316b010387dec3cece185a5700923ec970d6249681963ab9756c550549f96be"}
Apr 20 21:50:08.631995 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.631928 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-42clz"]
Apr 20 21:50:08.639077 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.639050 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.641739 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.641715 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 21:50:08.641842 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.641762 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 21:50:08.641908 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.641722 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 21:50:08.644295 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.643542 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 21:50:08.644295 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.643950 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-cb4zb\""
Apr 20 21:50:08.645858 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.645436 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 21:50:08.648233 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.647778 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-42clz"]
Apr 20 21:50:08.761074 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.761034 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbkhh\" (UniqueName: \"kubernetes.io/projected/299181ce-50b6-4ee4-bf17-96747e88ebdf-kube-api-access-cbkhh\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.761234 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.761088 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/299181ce-50b6-4ee4-bf17-96747e88ebdf-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.761234 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.761165 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/299181ce-50b6-4ee4-bf17-96747e88ebdf-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.761389 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.761258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/299181ce-50b6-4ee4-bf17-96747e88ebdf-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.861928 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.861895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/299181ce-50b6-4ee4-bf17-96747e88ebdf-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.862089 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.861985 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbkhh\" (UniqueName: \"kubernetes.io/projected/299181ce-50b6-4ee4-bf17-96747e88ebdf-kube-api-access-cbkhh\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.862089 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.862019 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/299181ce-50b6-4ee4-bf17-96747e88ebdf-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.862089 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.862048 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/299181ce-50b6-4ee4-bf17-96747e88ebdf-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.862700 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.862643 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/299181ce-50b6-4ee4-bf17-96747e88ebdf-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.864907 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.864858 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/299181ce-50b6-4ee4-bf17-96747e88ebdf-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.865027 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.865005 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/299181ce-50b6-4ee4-bf17-96747e88ebdf-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.870932 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.870905 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbkhh\" (UniqueName: \"kubernetes.io/projected/299181ce-50b6-4ee4-bf17-96747e88ebdf-kube-api-access-cbkhh\") pod \"prometheus-operator-5676c8c784-42clz\" (UID: \"299181ce-50b6-4ee4-bf17-96747e88ebdf\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:08.952867 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:08.952793 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz"
Apr 20 21:50:09.120053 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:09.120029 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-42clz"]
Apr 20 21:50:09.123618 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:09.123581 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod299181ce_50b6_4ee4_bf17_96747e88ebdf.slice/crio-983963dfee4a78affc3d02cf19478f478183d44261ded6a904d32af240a60ec9 WatchSource:0}: Error finding container 983963dfee4a78affc3d02cf19478f478183d44261ded6a904d32af240a60ec9: Status 404 returned error can't find the container with id 983963dfee4a78affc3d02cf19478f478183d44261ded6a904d32af240a60ec9
Apr 20 21:50:09.248161 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:09.248128 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66854c8fcc-xws46" event={"ID":"7bef215f-5a33-4576-950c-039084f6b70e","Type":"ContainerStarted","Data":"1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef"}
Apr 20 21:50:09.249478 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:09.249451 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz" event={"ID":"299181ce-50b6-4ee4-bf17-96747e88ebdf","Type":"ContainerStarted","Data":"983963dfee4a78affc3d02cf19478f478183d44261ded6a904d32af240a60ec9"}
Apr 20 21:50:09.265937 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:09.265889 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66854c8fcc-xws46" podStartSLOduration=1.317765214 podStartE2EDuration="4.265873375s" podCreationTimestamp="2026-04-20 21:50:05 +0000 UTC" firstStartedPulling="2026-04-20 21:50:06.104373609 +0000 UTC m=+174.913060951" lastFinishedPulling="2026-04-20 21:50:09.052481769 +0000 UTC m=+177.861169112" observedRunningTime="2026-04-20 21:50:09.265024448 +0000 UTC m=+178.073711818" watchObservedRunningTime="2026-04-20 21:50:09.265873375 +0000 UTC m=+178.074560741"
Apr 20 21:50:09.787231 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:09.787176 2566 patch_prober.go:28] interesting pod/image-registry-7b5f674b89-f7wg6 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 21:50:09.787697 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:09.787238 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" podUID="de9ba6bd-1329-40c2-b819-7dce1bdd20f0" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 21:50:10.651037 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.650983 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-847b654b88-rfpdk"]
Apr 20 21:50:10.654849 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.654825 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.665071 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.665048 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 21:50:10.665795 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.665766 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-847b654b88-rfpdk"]
Apr 20 21:50:10.779814 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.779743 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-serving-cert\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.779814 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.779789 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-trusted-ca-bundle\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.780065 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.779853 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrv5\" (UniqueName: \"kubernetes.io/projected/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-kube-api-access-2lrv5\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.780065 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.779877 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-oauth-serving-cert\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.780065 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.779940 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-config\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.780065 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.779967 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-service-ca\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.780065 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.779990 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-oauth-config\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.880991 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.880949 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-serving-cert\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.881501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.881004 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-trusted-ca-bundle\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.881501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.881062 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrv5\" (UniqueName: \"kubernetes.io/projected/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-kube-api-access-2lrv5\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.881501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.881089 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-oauth-serving-cert\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.881501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.881133 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-config\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.881501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.881164 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-service-ca\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.881501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.881187 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-oauth-config\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.882016 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.881993 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-trusted-ca-bundle\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.882119 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.882026 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-config\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.882119 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.882044 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-oauth-serving-cert\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.882436 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.882414 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-service-ca\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.883537 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.883515 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-oauth-config\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.883684 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.883666 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-serving-cert\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.888310 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.888271 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrv5\" (UniqueName: \"kubernetes.io/projected/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-kube-api-access-2lrv5\") pod \"console-847b654b88-rfpdk\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:10.971376 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:10.971339 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:11.256731 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:11.256693 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz" event={"ID":"299181ce-50b6-4ee4-bf17-96747e88ebdf","Type":"ContainerStarted","Data":"995e1a46f55fd1909db095956a48ec19329b283a739a809989a34993044d389a"}
Apr 20 21:50:11.256731 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:11.256733 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz" event={"ID":"299181ce-50b6-4ee4-bf17-96747e88ebdf","Type":"ContainerStarted","Data":"21a5e15af4d2f77d6db51951255b0170bb403e852aa710723145b59b5a9aca0a"}
Apr 20 21:50:11.273305 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:11.273230 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-42clz" podStartSLOduration=1.972843266 podStartE2EDuration="3.273211522s" podCreationTimestamp="2026-04-20 21:50:08 +0000 UTC" firstStartedPulling="2026-04-20 21:50:09.125703021 +0000 UTC m=+177.934390364" lastFinishedPulling="2026-04-20 21:50:10.426071274 +0000 UTC m=+179.234758620" observedRunningTime="2026-04-20 21:50:11.271524157 +0000 UTC m=+180.080211522" watchObservedRunningTime="2026-04-20 21:50:11.273211522 +0000 UTC m=+180.081898888"
Apr 20 21:50:12.988627 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:12.988587 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s"]
Apr 20 21:50:12.993647 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:12.993621 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:12.997319 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:12.997267 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-nwnmx\"" Apr 20 21:50:12.997443 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:12.997295 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 21:50:12.997666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:12.997647 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5fmtx"] Apr 20 21:50:12.997805 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:12.997773 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 21:50:13.000929 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.000909 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.007127 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.006847 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 21:50:13.007127 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.006882 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 21:50:13.007127 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.006930 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 21:50:13.007127 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.006972 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m7svp\"" Apr 20 21:50:13.008461 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.008439 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s"] Apr 20 21:50:13.103112 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103082 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.103318 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103168 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-wtmp\") pod 
\"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103318 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103219 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdznp\" (UniqueName: \"kubernetes.io/projected/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-kube-api-access-mdznp\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.103318 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103266 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q8td\" (UniqueName: \"kubernetes.io/projected/b715204d-ff93-47f6-a9fa-362fdf6c9628-kube-api-access-5q8td\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103318 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103311 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103570 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103360 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-textfile\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103570 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:50:13.103384 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-tls\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103570 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103408 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-root\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103570 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103459 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-sys\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103570 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103526 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-accelerators-collector-config\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103570 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103562 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: 
\"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.103826 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103593 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b715204d-ff93-47f6-a9fa-362fdf6c9628-metrics-client-ca\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.103826 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.103636 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.204832 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.204793 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q8td\" (UniqueName: \"kubernetes.io/projected/b715204d-ff93-47f6-a9fa-362fdf6c9628-kube-api-access-5q8td\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205005 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.204844 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205071 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205015 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-textfile\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205271 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205057 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-tls\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205391 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205309 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-root\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205391 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205341 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-sys\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205391 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205375 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-accelerators-collector-config\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205540 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:50:13.205401 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.205540 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205430 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b715204d-ff93-47f6-a9fa-362fdf6c9628-metrics-client-ca\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205540 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.205540 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205493 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-sys\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.205758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205574 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-root\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205587 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-wtmp\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.205758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.205623 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdznp\" (UniqueName: \"kubernetes.io/projected/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-kube-api-access-mdznp\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.206136 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.206049 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-wtmp\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.206136 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.206086 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-textfile\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.206309 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.206189 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.206607 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.206583 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-accelerators-collector-config\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.206707 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.206641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b715204d-ff93-47f6-a9fa-362fdf6c9628-metrics-client-ca\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.208003 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.207982 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-tls\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.208457 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.208433 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b715204d-ff93-47f6-a9fa-362fdf6c9628-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.208548 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.208473 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.208594 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.208582 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.215704 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.215660 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q8td\" (UniqueName: \"kubernetes.io/projected/b715204d-ff93-47f6-a9fa-362fdf6c9628-kube-api-access-5q8td\") pod \"node-exporter-5fmtx\" (UID: \"b715204d-ff93-47f6-a9fa-362fdf6c9628\") " pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:13.217245 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.217219 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdznp\" (UniqueName: \"kubernetes.io/projected/42dc29c7-316e-4d4b-a0e9-ea3a5161453e-kube-api-access-mdznp\") pod 
\"openshift-state-metrics-9d44df66c-gv47s\" (UID: \"42dc29c7-316e-4d4b-a0e9-ea3a5161453e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.304306 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.304212 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" Apr 20 21:50:13.312890 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:13.312864 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5fmtx" Apr 20 21:50:14.098302 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.094043 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 21:50:14.106692 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.101504 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 21:50:14.106692 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.101530 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.106692 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.106433 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.108021 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.108332 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.110804 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.110836 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.110870 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.111576 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.111751 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.111949 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 21:50:14.113470 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.112040 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-l4nr9\"" Apr 20 21:50:14.214505 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214470 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214662 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214518 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214662 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214564 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214662 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214581 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214662 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214607 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-out\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214662 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214639 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214662 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214657 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214929 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214685 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2pfb\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-kube-api-access-s2pfb\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214929 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214751 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214929 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214832 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214929 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214865 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-web-config\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.214929 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214893 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-volume\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.215116 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.214938 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 
21:50:14.315416 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315606 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315436 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315606 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315606 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315606 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315800 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315713 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-out\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315800 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315781 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315899 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315817 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315899 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315854 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2pfb\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-kube-api-access-s2pfb\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.315899 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315889 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.316050 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315919 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.316050 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315943 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-web-config\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.316050 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.315971 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-volume\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.316884 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:50:14.316373 2566 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 21:50:14.316884 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.316395 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.316884 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.316406 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.316884 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:50:14.316444 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls podName:f962aea3-92ba-43bf-849e-8aadf364cbd5 nodeName:}" failed. No retries permitted until 2026-04-20 21:50:14.816424705 +0000 UTC m=+183.625112049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5") : secret "alertmanager-main-tls" not found Apr 20 21:50:14.316884 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.316541 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.319090 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.319043 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 
21:50:14.319090 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.319055 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-out\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.319737 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.319601 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.320274 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.320252 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-volume\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.320472 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.320452 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.320547 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.320532 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-web-config\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.321084 
ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.321062 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.321324 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.321302 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.324988 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.324967 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2pfb\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-kube-api-access-s2pfb\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.820845 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.820808 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.823674 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.823644 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls\") pod 
\"alertmanager-main-0\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:14.979138 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.979093 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-66755bd5f8-nctmr"] Apr 20 21:50:14.985518 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.985480 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:14.988251 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.988223 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 21:50:14.988398 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.988223 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6gca4vuikv9n7\"" Apr 20 21:50:14.988656 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.988635 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 21:50:14.988751 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.988686 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-tb254\"" Apr 20 21:50:14.988751 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.988743 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 21:50:14.989434 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.989416 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 21:50:14.989514 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.989423 2566 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 21:50:14.995546 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:14.995518 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-66755bd5f8-nctmr"] Apr 20 21:50:15.022665 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.022621 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:50:15.123862 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.123777 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.123862 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.123828 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-grpc-tls\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.123862 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.123853 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkbc\" (UniqueName: \"kubernetes.io/projected/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-kube-api-access-gjkbc\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.124542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.123914 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-metrics-client-ca\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.124542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.124035 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.124542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.124103 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-tls\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.124542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.124144 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.124542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.124178 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.224899 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.224866 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.225098 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.224922 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-tls\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.225098 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.224966 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.225098 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.225012 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.225098 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.225055 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.225098 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.225084 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-grpc-tls\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.225392 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.225194 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkbc\" (UniqueName: \"kubernetes.io/projected/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-kube-api-access-gjkbc\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.225392 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.225241 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-metrics-client-ca\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " 
pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.226060 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.226012 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-metrics-client-ca\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.228048 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.227913 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.228048 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.228000 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.228230 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.228083 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.228306 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.228259 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-grpc-tls\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.228509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.228491 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-tls\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.229387 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.229361 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.232698 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.232675 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkbc\" (UniqueName: \"kubernetes.io/projected/cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00-kube-api-access-gjkbc\") pod \"thanos-querier-66755bd5f8-nctmr\" (UID: \"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00\") " pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.296826 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.296801 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" Apr 20 21:50:15.974841 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.974809 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66854c8fcc-xws46" Apr 20 21:50:15.975037 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.974851 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66854c8fcc-xws46" Apr 20 21:50:15.980354 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:15.980321 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66854c8fcc-xws46" Apr 20 21:50:16.276312 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:16.276210 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66854c8fcc-xws46" Apr 20 21:50:17.761219 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:17.761181 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb715204d_ff93_47f6_a9fa_362fdf6c9628.slice/crio-5a2298fe08dc8862d44480f724ab49c19a3dd2d45c0f4c205f503080e95a3848 WatchSource:0}: Error finding container 5a2298fe08dc8862d44480f724ab49c19a3dd2d45c0f4c205f503080e95a3848: Status 404 returned error can't find the container with id 5a2298fe08dc8862d44480f724ab49c19a3dd2d45c0f4c205f503080e95a3848 Apr 20 21:50:17.921801 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:17.921777 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-66755bd5f8-nctmr"] Apr 20 21:50:17.924509 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:17.924482 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfe1d2ca_d19d_496d_b7bf_a3cc60d1cb00.slice/crio-fa0cfac639ef86133564dd718bfc996e651293ee304f51f189a95afd6cacb289 
WatchSource:0}: Error finding container fa0cfac639ef86133564dd718bfc996e651293ee304f51f189a95afd6cacb289: Status 404 returned error can't find the container with id fa0cfac639ef86133564dd718bfc996e651293ee304f51f189a95afd6cacb289 Apr 20 21:50:17.966375 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:17.966352 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-847b654b88-rfpdk"] Apr 20 21:50:17.967928 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:17.967905 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaedeb3a7_3b0e_427c_92e3_efcf4c2cb772.slice/crio-55b722514a62b5f786760a57caf8ba8b2873f4d81efc70b4ec651edb56c72029 WatchSource:0}: Error finding container 55b722514a62b5f786760a57caf8ba8b2873f4d81efc70b4ec651edb56c72029: Status 404 returned error can't find the container with id 55b722514a62b5f786760a57caf8ba8b2873f4d81efc70b4ec651edb56c72029 Apr 20 21:50:18.143356 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.143299 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s"] Apr 20 21:50:18.147777 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:18.147262 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42dc29c7_316e_4d4b_a0e9_ea3a5161453e.slice/crio-15c5ec2821ed1046c81176b67aa13da70db468d5fdfcabb24c6dde17654acc6e WatchSource:0}: Error finding container 15c5ec2821ed1046c81176b67aa13da70db468d5fdfcabb24c6dde17654acc6e: Status 404 returned error can't find the container with id 15c5ec2821ed1046c81176b67aa13da70db468d5fdfcabb24c6dde17654acc6e Apr 20 21:50:18.150942 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.150917 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 21:50:18.153673 ip-10-0-137-199 kubenswrapper[2566]: W0420 
21:50:18.153621 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf962aea3_92ba_43bf_849e_8aadf364cbd5.slice/crio-91596c2c08db2580b30897d2058568179a6d8d6278b6ce4c91c7c7f32d287a3e WatchSource:0}: Error finding container 91596c2c08db2580b30897d2058568179a6d8d6278b6ce4c91c7c7f32d287a3e: Status 404 returned error can't find the container with id 91596c2c08db2580b30897d2058568179a6d8d6278b6ce4c91c7c7f32d287a3e Apr 20 21:50:18.283163 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.283128 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" event={"ID":"42dc29c7-316e-4d4b-a0e9-ea3a5161453e","Type":"ContainerStarted","Data":"5009cd2ad75e59dcc4ce536ad5939f77849b3d86b4631ebf1cb3b3a32f15ac5c"} Apr 20 21:50:18.283271 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.283175 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" event={"ID":"42dc29c7-316e-4d4b-a0e9-ea3a5161453e","Type":"ContainerStarted","Data":"15c5ec2821ed1046c81176b67aa13da70db468d5fdfcabb24c6dde17654acc6e"} Apr 20 21:50:18.284435 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.284400 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5fmtx" event={"ID":"b715204d-ff93-47f6-a9fa-362fdf6c9628","Type":"ContainerStarted","Data":"5a2298fe08dc8862d44480f724ab49c19a3dd2d45c0f4c205f503080e95a3848"} Apr 20 21:50:18.286641 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.286416 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wz52s" event={"ID":"c6a9fd05-b4c9-4635-889f-49259cf8782a","Type":"ContainerStarted","Data":"968080017f301579e800565c231a8c675047ded8961f1aa1149ec3f5a99c426e"} Apr 20 21:50:18.286871 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.286691 2566 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-wz52s"
Apr 20 21:50:18.288334 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.288306 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847b654b88-rfpdk" event={"ID":"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772","Type":"ContainerStarted","Data":"9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e"}
Apr 20 21:50:18.288428 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.288339 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847b654b88-rfpdk" event={"ID":"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772","Type":"ContainerStarted","Data":"55b722514a62b5f786760a57caf8ba8b2873f4d81efc70b4ec651edb56c72029"}
Apr 20 21:50:18.290012 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.289934 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerStarted","Data":"91596c2c08db2580b30897d2058568179a6d8d6278b6ce4c91c7c7f32d287a3e"}
Apr 20 21:50:18.291180 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.291153 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" event={"ID":"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00","Type":"ContainerStarted","Data":"fa0cfac639ef86133564dd718bfc996e651293ee304f51f189a95afd6cacb289"}
Apr 20 21:50:18.299550 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.299522 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-wz52s"
Apr 20 21:50:18.308119 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.308040 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-wz52s" podStartSLOduration=1.659209902 podStartE2EDuration="19.308027819s" podCreationTimestamp="2026-04-20 21:49:59 +0000 UTC" firstStartedPulling="2026-04-20 21:50:00.237155922 +0000 UTC m=+169.045843265" lastFinishedPulling="2026-04-20 21:50:17.885973824 +0000 UTC m=+186.694661182" observedRunningTime="2026-04-20 21:50:18.306418441 +0000 UTC m=+187.115105828" watchObservedRunningTime="2026-04-20 21:50:18.308027819 +0000 UTC m=+187.116715184"
Apr 20 21:50:18.324206 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:18.324139 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-847b654b88-rfpdk" podStartSLOduration=8.324122898 podStartE2EDuration="8.324122898s" podCreationTimestamp="2026-04-20 21:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:50:18.322972707 +0000 UTC m=+187.131660121" watchObservedRunningTime="2026-04-20 21:50:18.324122898 +0000 UTC m=+187.132810263"
Apr 20 21:50:19.298799 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.298748 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" event={"ID":"42dc29c7-316e-4d4b-a0e9-ea3a5161453e","Type":"ContainerStarted","Data":"6a55c8abc93539cf0e553c155e2b301292a381dc308f29bacad97570d6d00c6f"}
Apr 20 21:50:19.301262 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.300739 2566 generic.go:358] "Generic (PLEG): container finished" podID="b715204d-ff93-47f6-a9fa-362fdf6c9628" containerID="4af843852972c53297f8cff0eb52f82f7e4dbbd23167fa86a312c3a9cc7f4124" exitCode=0
Apr 20 21:50:19.301262 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.301168 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5fmtx" event={"ID":"b715204d-ff93-47f6-a9fa-362fdf6c9628","Type":"ContainerDied","Data":"4af843852972c53297f8cff0eb52f82f7e4dbbd23167fa86a312c3a9cc7f4124"}
Apr 20 21:50:19.470176 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.470130 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-847b654b88-rfpdk"]
Apr 20 21:50:19.502728 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.501800 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5475666778-27zs5"]
Apr 20 21:50:19.508330 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.508301 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.515131 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.515085 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5475666778-27zs5"]
Apr 20 21:50:19.671736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.670808 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-oauth-serving-cert\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.671736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.670876 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-serving-cert\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.671736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.670904 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rlx\" (UniqueName: \"kubernetes.io/projected/a628e407-40cc-4e12-b139-946e014e4c8e-kube-api-access-54rlx\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.671736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.670942 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-oauth-config\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.671736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.670966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-trusted-ca-bundle\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.671736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.670995 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-service-ca\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.671736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.671037 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-console-config\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.772930 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-serving-cert\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.772997 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54rlx\" (UniqueName: \"kubernetes.io/projected/a628e407-40cc-4e12-b139-946e014e4c8e-kube-api-access-54rlx\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.773054 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-oauth-config\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.773079 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-trusted-ca-bundle\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.773115 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-service-ca\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.773170 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-console-config\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.773270 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-oauth-serving-cert\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.774015 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-oauth-serving-cert\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.774597 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.774591 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-service-ca\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.775175 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.774793 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-console-config\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.776975 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.775870 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-trusted-ca-bundle\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.776975 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.776944 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-serving-cert\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.784405 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.784357 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-oauth-config\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.784553 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.784523 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rlx\" (UniqueName: \"kubernetes.io/projected/a628e407-40cc-4e12-b139-946e014e4c8e-kube-api-access-54rlx\") pod \"console-5475666778-27zs5\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:19.787209 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.787190 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:50:19.824741 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:19.824302 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:20.971895 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:20.971861 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-847b654b88-rfpdk"
Apr 20 21:50:21.381185 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:21.380875 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5475666778-27zs5"]
Apr 20 21:50:21.387189 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:50:21.386023 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda628e407_40cc_4e12_b139_946e014e4c8e.slice/crio-6577ab426fa336ec1eb92be95411bec8fc315856d10c6851c847d312813aa09b WatchSource:0}: Error finding container 6577ab426fa336ec1eb92be95411bec8fc315856d10c6851c847d312813aa09b: Status 404 returned error can't find the container with id 6577ab426fa336ec1eb92be95411bec8fc315856d10c6851c847d312813aa09b
Apr 20 21:50:22.314128 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.314091 2566 generic.go:358] "Generic (PLEG): container finished" podID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerID="8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11" exitCode=0
Apr 20 21:50:22.314677 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.314472 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerDied","Data":"8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11"}
Apr 20 21:50:22.318170 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.318129 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" event={"ID":"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00","Type":"ContainerStarted","Data":"b71acc4dd227cda91f5481c98b21f23b918c28141307bde0e8a152e160763756"}
Apr 20 21:50:22.318274 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.318180 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" event={"ID":"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00","Type":"ContainerStarted","Data":"87959fb089a0f762d023ff4f21198ebb939273c018ca3cbd850319d99a1b8f76"}
Apr 20 21:50:22.318274 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.318197 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" event={"ID":"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00","Type":"ContainerStarted","Data":"939cf4633b93c037ed61ccb93a815018ec68a5a67fd56c1a328aafb47a7b4ae3"}
Apr 20 21:50:22.319830 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.319801 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5475666778-27zs5" event={"ID":"a628e407-40cc-4e12-b139-946e014e4c8e","Type":"ContainerStarted","Data":"4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef"}
Apr 20 21:50:22.319943 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.319837 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5475666778-27zs5" event={"ID":"a628e407-40cc-4e12-b139-946e014e4c8e","Type":"ContainerStarted","Data":"6577ab426fa336ec1eb92be95411bec8fc315856d10c6851c847d312813aa09b"}
Apr 20 21:50:22.321853 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.321831 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" event={"ID":"42dc29c7-316e-4d4b-a0e9-ea3a5161453e","Type":"ContainerStarted","Data":"0e550ebd226c0cb4ea0c57b6b6f3a2666f9aa1aea5a6d499d03397cce96aa40a"}
Apr 20 21:50:22.323997 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.323975 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5fmtx" event={"ID":"b715204d-ff93-47f6-a9fa-362fdf6c9628","Type":"ContainerStarted","Data":"a25945370e06bfb7b0b746b9bbbf66f3b1d4824b04f543f55caa900892443006"}
Apr 20 21:50:22.324095 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.323998 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5fmtx" event={"ID":"b715204d-ff93-47f6-a9fa-362fdf6c9628","Type":"ContainerStarted","Data":"7aa1e6c5deb5ccaf2e1e1df2cf39986116c42cc5ad8cf518c146ecccda034864"}
Apr 20 21:50:22.390199 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.390150 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5fmtx" podStartSLOduration=9.579275688 podStartE2EDuration="10.390137619s" podCreationTimestamp="2026-04-20 21:50:12 +0000 UTC" firstStartedPulling="2026-04-20 21:50:17.765327306 +0000 UTC m=+186.574014658" lastFinishedPulling="2026-04-20 21:50:18.576189246 +0000 UTC m=+187.384876589" observedRunningTime="2026-04-20 21:50:22.388121489 +0000 UTC m=+191.196808881" watchObservedRunningTime="2026-04-20 21:50:22.390137619 +0000 UTC m=+191.198825001"
Apr 20 21:50:22.413364 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.413311 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5475666778-27zs5" podStartSLOduration=3.413295819 podStartE2EDuration="3.413295819s" podCreationTimestamp="2026-04-20 21:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:50:22.411430111 +0000 UTC m=+191.220117498" watchObservedRunningTime="2026-04-20 21:50:22.413295819 +0000 UTC m=+191.221983179"
Apr 20 21:50:22.428870 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:22.428820 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gv47s" podStartSLOduration=7.513711638 podStartE2EDuration="10.428808001s" podCreationTimestamp="2026-04-20 21:50:12 +0000 UTC" firstStartedPulling="2026-04-20 21:50:18.312995452 +0000 UTC m=+187.121682806" lastFinishedPulling="2026-04-20 21:50:21.228091812 +0000 UTC m=+190.036779169" observedRunningTime="2026-04-20 21:50:22.427388059 +0000 UTC m=+191.236075423" watchObservedRunningTime="2026-04-20 21:50:22.428808001 +0000 UTC m=+191.237495362"
Apr 20 21:50:23.333759 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:23.333720 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" event={"ID":"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00","Type":"ContainerStarted","Data":"32a461e84c47ffe3e3d7dd4a4d74e128e7f348897210dcf1263e313c7dcca405"}
Apr 20 21:50:23.334225 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:23.333775 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" event={"ID":"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00","Type":"ContainerStarted","Data":"48ee7525dd7c037c43be85785f08f2475569d223ac1914e05c291b9026ee35c7"}
Apr 20 21:50:24.339515 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:24.339474 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" event={"ID":"cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00","Type":"ContainerStarted","Data":"4eea6adf811604520a51442736f781b7f5df26f363a58232efedd9c13978f2d8"}
Apr 20 21:50:24.339960 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:24.339689 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr"
Apr 20 21:50:24.364130 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:24.364073 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr" podStartSLOduration=5.266294072 podStartE2EDuration="10.364050503s" podCreationTimestamp="2026-04-20 21:50:14 +0000 UTC" firstStartedPulling="2026-04-20 21:50:17.926559819 +0000 UTC m=+186.735247176" lastFinishedPulling="2026-04-20 21:50:23.02431625 +0000 UTC m=+191.833003607" observedRunningTime="2026-04-20 21:50:24.361326181 +0000 UTC m=+193.170013546" watchObservedRunningTime="2026-04-20 21:50:24.364050503 +0000 UTC m=+193.172737869"
Apr 20 21:50:24.802432 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:24.802369 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" podUID="de9ba6bd-1329-40c2-b819-7dce1bdd20f0" containerName="registry" containerID="cri-o://db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a" gracePeriod=30
Apr 20 21:50:25.064854 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.064832 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:50:25.229267 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229235 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfbfv\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-kube-api-access-zfbfv\") pod \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") "
Apr 20 21:50:25.229400 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229299 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-image-registry-private-configuration\") pod \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") "
Apr 20 21:50:25.229463 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229401 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-ca-trust-extracted\") pod \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") "
Apr 20 21:50:25.229463 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229434 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-trusted-ca\") pod \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") "
Apr 20 21:50:25.229571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229479 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-certificates\") pod \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") "
Apr 20 21:50:25.229571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229515 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-bound-sa-token\") pod \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") "
Apr 20 21:50:25.229571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229567 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-installation-pull-secrets\") pod \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") "
Apr 20 21:50:25.229716 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229592 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") pod \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\" (UID: \"de9ba6bd-1329-40c2-b819-7dce1bdd20f0\") "
Apr 20 21:50:25.230023 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.229842 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "de9ba6bd-1329-40c2-b819-7dce1bdd20f0" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:50:25.230327 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.230244 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "de9ba6bd-1329-40c2-b819-7dce1bdd20f0" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:50:25.233461 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.233413 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "de9ba6bd-1329-40c2-b819-7dce1bdd20f0" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:50:25.234141 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.234102 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "de9ba6bd-1329-40c2-b819-7dce1bdd20f0" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:50:25.234454 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.234411 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "de9ba6bd-1329-40c2-b819-7dce1bdd20f0" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:50:25.234631 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.234532 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "de9ba6bd-1329-40c2-b819-7dce1bdd20f0" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:50:25.234877 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.234844 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-kube-api-access-zfbfv" (OuterVolumeSpecName: "kube-api-access-zfbfv") pod "de9ba6bd-1329-40c2-b819-7dce1bdd20f0" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0"). InnerVolumeSpecName "kube-api-access-zfbfv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:50:25.242166 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.242142 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "de9ba6bd-1329-40c2-b819-7dce1bdd20f0" (UID: "de9ba6bd-1329-40c2-b819-7dce1bdd20f0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:50:25.330949 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.330920 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zfbfv\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-kube-api-access-zfbfv\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:50:25.331058 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.330953 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-image-registry-private-configuration\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:50:25.331058 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.330968 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-ca-trust-extracted\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:50:25.331058 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.330978 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-trusted-ca\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:50:25.331058 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.330987 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-certificates\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:50:25.331058 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.330998 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-bound-sa-token\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:50:25.331058 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.331012 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-installation-pull-secrets\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:50:25.331058 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.331026 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de9ba6bd-1329-40c2-b819-7dce1bdd20f0-registry-tls\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:50:25.345741 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.345712 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerStarted","Data":"1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d"}
Apr 20 21:50:25.346084 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.345751 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerStarted","Data":"79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565"}
Apr 20 21:50:25.346084 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.345764 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerStarted","Data":"9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014"}
Apr 20 21:50:25.346084 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.345776 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerStarted","Data":"92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef"}
Apr 20 21:50:25.346084 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.345789 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerStarted","Data":"115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe"}
Apr 20 21:50:25.347037 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.347011 2566 generic.go:358] "Generic (PLEG): container finished" podID="de9ba6bd-1329-40c2-b819-7dce1bdd20f0" containerID="db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a" exitCode=0
Apr 20 21:50:25.347329 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.347295 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" event={"ID":"de9ba6bd-1329-40c2-b819-7dce1bdd20f0","Type":"ContainerDied","Data":"db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a"}
Apr 20 21:50:25.347416 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.347340 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6" event={"ID":"de9ba6bd-1329-40c2-b819-7dce1bdd20f0","Type":"ContainerDied","Data":"68936f7021935f1ebb8e6995eed81aecb0af54db35c0d029ae2a4bad23d36087"}
Apr 20 21:50:25.347416 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.347361 2566 scope.go:117] "RemoveContainer" containerID="db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a"
Apr 20 21:50:25.347529 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.347495 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b5f674b89-f7wg6"
Apr 20 21:50:25.356234 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.356219 2566 scope.go:117] "RemoveContainer" containerID="db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a"
Apr 20 21:50:25.356506 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:50:25.356486 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a\": container with ID starting with db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a not found: ID does not exist" containerID="db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a"
Apr 20 21:50:25.356587 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.356515 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a"} err="failed to get container status \"db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a\": rpc error: code = NotFound desc = could not find container \"db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a\": container with ID starting with db69d042c69cdd5d24e410a1ab1e2d39cbb647689ec47ee4a73639c7e76f3a6a not found: ID does not exist"
Apr 20 21:50:25.381268 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.381245 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b5f674b89-f7wg6"]
Apr 20 21:50:25.384720 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.384672 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7b5f674b89-f7wg6"]
Apr 20 21:50:25.774157 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:25.774122 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9ba6bd-1329-40c2-b819-7dce1bdd20f0" path="/var/lib/kubelet/pods/de9ba6bd-1329-40c2-b819-7dce1bdd20f0/volumes"
Apr 20 21:50:26.354941 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:26.354911 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerStarted","Data":"ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1"}
Apr 20 21:50:26.381982 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:26.381938 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=5.94664671 podStartE2EDuration="12.381921603s" podCreationTimestamp="2026-04-20 21:50:14 +0000 UTC" firstStartedPulling="2026-04-20 21:50:18.155885063 +0000 UTC m=+186.964572419" lastFinishedPulling="2026-04-20 21:50:24.591159955 +0000 UTC m=+193.399847312" observedRunningTime="2026-04-20 21:50:26.380466456 +0000 UTC m=+195.189153846" watchObservedRunningTime="2026-04-20 21:50:26.381921603 +0000 UTC m=+195.190608970"
Apr 20 21:50:29.825166 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:29.825133 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:29.825166 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:29.825174 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:29.829592 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:29.829572 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:30.354574 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:30.354540 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-66755bd5f8-nctmr"
Apr 20 21:50:30.370998 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:30.370975 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5475666778-27zs5"
Apr 20 21:50:30.422861 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:30.422829 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66854c8fcc-xws46"]
Apr 20 21:50:45.328733 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.328678 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-847b654b88-rfpdk" podUID="aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" containerName="console" containerID="cri-o://9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e" gracePeriod=15
Apr 20 21:50:45.568852 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.568830 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-847b654b88-rfpdk_aedeb3a7-3b0e-427c-92e3-efcf4c2cb772/console/0.log"
Apr 20 21:50:45.568955 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.568888 2566 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-847b654b88-rfpdk" Apr 20 21:50:45.603238 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603172 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-trusted-ca-bundle\") pod \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " Apr 20 21:50:45.603238 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603207 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrv5\" (UniqueName: \"kubernetes.io/projected/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-kube-api-access-2lrv5\") pod \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " Apr 20 21:50:45.603438 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603242 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-serving-cert\") pod \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " Apr 20 21:50:45.603438 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603400 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-service-ca\") pod \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " Apr 20 21:50:45.603542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603473 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-oauth-serving-cert\") pod \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " Apr 20 21:50:45.603598 
ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603557 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-config\") pod \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " Apr 20 21:50:45.603598 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603584 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-oauth-config\") pod \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\" (UID: \"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772\") " Apr 20 21:50:45.603794 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603742 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" (UID: "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:45.603905 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603835 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-service-ca" (OuterVolumeSpecName: "service-ca") pod "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" (UID: "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:45.603905 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603888 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-trusted-ca-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:45.603905 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603890 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" (UID: "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:45.604057 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.603930 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-config" (OuterVolumeSpecName: "console-config") pod "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" (UID: "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:45.605495 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.605472 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-kube-api-access-2lrv5" (OuterVolumeSpecName: "kube-api-access-2lrv5") pod "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" (UID: "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772"). InnerVolumeSpecName "kube-api-access-2lrv5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:50:45.605583 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.605531 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" (UID: "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:50:45.605632 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.605613 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" (UID: "aedeb3a7-3b0e-427c-92e3-efcf4c2cb772"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:50:45.704301 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.704248 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-oauth-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:45.704301 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.704297 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:45.704301 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.704307 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-oauth-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:45.704520 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:50:45.704318 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lrv5\" (UniqueName: \"kubernetes.io/projected/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-kube-api-access-2lrv5\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:45.704520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.704328 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-console-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:45.704520 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:45.704337 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772-service-ca\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:46.416587 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.416560 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-847b654b88-rfpdk_aedeb3a7-3b0e-427c-92e3-efcf4c2cb772/console/0.log" Apr 20 21:50:46.417069 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.416597 2566 generic.go:358] "Generic (PLEG): container finished" podID="aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" containerID="9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e" exitCode=2 Apr 20 21:50:46.417069 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.416629 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847b654b88-rfpdk" event={"ID":"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772","Type":"ContainerDied","Data":"9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e"} Apr 20 21:50:46.417069 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.416669 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-847b654b88-rfpdk" 
event={"ID":"aedeb3a7-3b0e-427c-92e3-efcf4c2cb772","Type":"ContainerDied","Data":"55b722514a62b5f786760a57caf8ba8b2873f4d81efc70b4ec651edb56c72029"} Apr 20 21:50:46.417069 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.416689 2566 scope.go:117] "RemoveContainer" containerID="9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e" Apr 20 21:50:46.417069 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.416707 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-847b654b88-rfpdk" Apr 20 21:50:46.430469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.430452 2566 scope.go:117] "RemoveContainer" containerID="9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e" Apr 20 21:50:46.430712 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:50:46.430693 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e\": container with ID starting with 9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e not found: ID does not exist" containerID="9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e" Apr 20 21:50:46.430822 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.430720 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e"} err="failed to get container status \"9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e\": rpc error: code = NotFound desc = could not find container \"9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e\": container with ID starting with 9be5f68555610c8454898b86141d99520942f36bc7e9302bf75722c5d6a5627e not found: ID does not exist" Apr 20 21:50:46.434105 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.434085 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-847b654b88-rfpdk"] Apr 20 21:50:46.436949 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:46.436930 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-847b654b88-rfpdk"] Apr 20 21:50:47.773006 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:47.772974 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" path="/var/lib/kubelet/pods/aedeb3a7-3b0e-427c-92e3-efcf4c2cb772/volumes" Apr 20 21:50:55.441498 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.441460 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66854c8fcc-xws46" podUID="7bef215f-5a33-4576-950c-039084f6b70e" containerName="console" containerID="cri-o://1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef" gracePeriod=15 Apr 20 21:50:55.708595 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.708574 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66854c8fcc-xws46_7bef215f-5a33-4576-950c-039084f6b70e/console/0.log" Apr 20 21:50:55.708697 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.708636 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66854c8fcc-xws46" Apr 20 21:50:55.782942 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.782922 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-serving-cert\") pod \"7bef215f-5a33-4576-950c-039084f6b70e\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " Apr 20 21:50:55.783060 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.782979 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-console-config\") pod \"7bef215f-5a33-4576-950c-039084f6b70e\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " Apr 20 21:50:55.783060 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.783019 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-oauth-serving-cert\") pod \"7bef215f-5a33-4576-950c-039084f6b70e\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " Apr 20 21:50:55.783145 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.783065 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-oauth-config\") pod \"7bef215f-5a33-4576-950c-039084f6b70e\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " Apr 20 21:50:55.783145 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.783108 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrlh2\" (UniqueName: \"kubernetes.io/projected/7bef215f-5a33-4576-950c-039084f6b70e-kube-api-access-qrlh2\") pod \"7bef215f-5a33-4576-950c-039084f6b70e\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " Apr 20 
21:50:55.783145 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.783134 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-service-ca\") pod \"7bef215f-5a33-4576-950c-039084f6b70e\" (UID: \"7bef215f-5a33-4576-950c-039084f6b70e\") " Apr 20 21:50:55.783529 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.783495 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7bef215f-5a33-4576-950c-039084f6b70e" (UID: "7bef215f-5a33-4576-950c-039084f6b70e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:55.783529 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.783502 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-console-config" (OuterVolumeSpecName: "console-config") pod "7bef215f-5a33-4576-950c-039084f6b70e" (UID: "7bef215f-5a33-4576-950c-039084f6b70e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:55.783711 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.783571 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-service-ca" (OuterVolumeSpecName: "service-ca") pod "7bef215f-5a33-4576-950c-039084f6b70e" (UID: "7bef215f-5a33-4576-950c-039084f6b70e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:55.785103 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.785076 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bef215f-5a33-4576-950c-039084f6b70e-kube-api-access-qrlh2" (OuterVolumeSpecName: "kube-api-access-qrlh2") pod "7bef215f-5a33-4576-950c-039084f6b70e" (UID: "7bef215f-5a33-4576-950c-039084f6b70e"). InnerVolumeSpecName "kube-api-access-qrlh2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:50:55.785103 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.785081 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7bef215f-5a33-4576-950c-039084f6b70e" (UID: "7bef215f-5a33-4576-950c-039084f6b70e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:50:55.785330 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.785123 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7bef215f-5a33-4576-950c-039084f6b70e" (UID: "7bef215f-5a33-4576-950c-039084f6b70e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:50:55.883994 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.883967 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:55.883994 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.883992 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-console-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:55.884148 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.884003 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-oauth-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:55.884148 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.884012 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bef215f-5a33-4576-950c-039084f6b70e-console-oauth-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:55.884148 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.884022 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrlh2\" (UniqueName: \"kubernetes.io/projected/7bef215f-5a33-4576-950c-039084f6b70e-kube-api-access-qrlh2\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:55.884148 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:55.884030 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bef215f-5a33-4576-950c-039084f6b70e-service-ca\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:50:56.444848 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:50:56.444825 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66854c8fcc-xws46_7bef215f-5a33-4576-950c-039084f6b70e/console/0.log" Apr 20 21:50:56.445236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.444860 2566 generic.go:358] "Generic (PLEG): container finished" podID="7bef215f-5a33-4576-950c-039084f6b70e" containerID="1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef" exitCode=2 Apr 20 21:50:56.445236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.444919 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66854c8fcc-xws46" Apr 20 21:50:56.445236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.444924 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66854c8fcc-xws46" event={"ID":"7bef215f-5a33-4576-950c-039084f6b70e","Type":"ContainerDied","Data":"1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef"} Apr 20 21:50:56.445236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.444958 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66854c8fcc-xws46" event={"ID":"7bef215f-5a33-4576-950c-039084f6b70e","Type":"ContainerDied","Data":"4316b010387dec3cece185a5700923ec970d6249681963ab9756c550549f96be"} Apr 20 21:50:56.445236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.444975 2566 scope.go:117] "RemoveContainer" containerID="1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef" Apr 20 21:50:56.454973 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.454488 2566 scope.go:117] "RemoveContainer" containerID="1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef" Apr 20 21:50:56.455442 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:50:56.455421 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef\": container with ID starting with 1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef not found: ID does not exist" containerID="1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef" Apr 20 21:50:56.455539 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.455454 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef"} err="failed to get container status \"1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef\": rpc error: code = NotFound desc = could not find container \"1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef\": container with ID starting with 1c6f53a3a6b98aa0114b1915569b8428c9d811982b66bfe08fd439ec24a5f6ef not found: ID does not exist" Apr 20 21:50:56.464535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.464514 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66854c8fcc-xws46"] Apr 20 21:50:56.468471 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:56.468443 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66854c8fcc-xws46"] Apr 20 21:50:57.773566 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:50:57.773534 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bef215f-5a33-4576-950c-039084f6b70e" path="/var/lib/kubelet/pods/7bef215f-5a33-4576-950c-039084f6b70e/volumes" Apr 20 21:51:22.591428 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:22.591394 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:51:22.593536 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:51:22.593513 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86a4e942-ea9e-4978-b92e-c96688b972a3-metrics-certs\") pod \"network-metrics-daemon-j8c9k\" (UID: \"86a4e942-ea9e-4978-b92e-c96688b972a3\") " pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:51:22.673133 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:22.673102 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5wzfs\"" Apr 20 21:51:22.681404 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:22.681382 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j8c9k" Apr 20 21:51:22.800198 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:22.800173 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j8c9k"] Apr 20 21:51:22.802766 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:51:22.802730 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a4e942_ea9e_4978_b92e_c96688b972a3.slice/crio-849560f09f20c7375e6bb2f7a11cddd2cb5e83819fe77638ba7d16a4bb55f40a WatchSource:0}: Error finding container 849560f09f20c7375e6bb2f7a11cddd2cb5e83819fe77638ba7d16a4bb55f40a: Status 404 returned error can't find the container with id 849560f09f20c7375e6bb2f7a11cddd2cb5e83819fe77638ba7d16a4bb55f40a Apr 20 21:51:23.518970 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:23.518928 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j8c9k" event={"ID":"86a4e942-ea9e-4978-b92e-c96688b972a3","Type":"ContainerStarted","Data":"849560f09f20c7375e6bb2f7a11cddd2cb5e83819fe77638ba7d16a4bb55f40a"} Apr 20 21:51:24.525254 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:24.525223 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j8c9k" event={"ID":"86a4e942-ea9e-4978-b92e-c96688b972a3","Type":"ContainerStarted","Data":"444e62138e887eb89ca0c793ffcd37b8734a06ee5d47db92ef5faa2c7970f6d1"} Apr 20 21:51:24.525254 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:24.525256 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j8c9k" event={"ID":"86a4e942-ea9e-4978-b92e-c96688b972a3","Type":"ContainerStarted","Data":"004deb5fb9b50bfdfe97d10e7476433ebf43a260d42e6c7fdb555197192bf06b"} Apr 20 21:51:24.544040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:24.543993 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j8c9k" podStartSLOduration=252.705496774 podStartE2EDuration="4m13.543975797s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:51:22.804721357 +0000 UTC m=+251.613408704" lastFinishedPulling="2026-04-20 21:51:23.643200385 +0000 UTC m=+252.451887727" observedRunningTime="2026-04-20 21:51:24.542772256 +0000 UTC m=+253.351459617" watchObservedRunningTime="2026-04-20 21:51:24.543975797 +0000 UTC m=+253.352663166" Apr 20 21:51:33.215891 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.215805 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 21:51:33.216475 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.216271 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="alertmanager" containerID="cri-o://115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe" gracePeriod=120 Apr 20 21:51:33.216475 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.216300 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy-metric" containerID="cri-o://1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d" gracePeriod=120 Apr 20 21:51:33.216475 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.216340 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy-web" containerID="cri-o://9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014" gracePeriod=120 Apr 20 21:51:33.216475 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.216394 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="prom-label-proxy" containerID="cri-o://ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1" gracePeriod=120 Apr 20 21:51:33.216475 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.216443 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy" containerID="cri-o://79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565" gracePeriod=120 Apr 20 21:51:33.216727 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.216394 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="config-reloader" containerID="cri-o://92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef" gracePeriod=120 Apr 20 21:51:33.563830 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563740 2566 generic.go:358] "Generic (PLEG): container finished" podID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerID="ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1" 
exitCode=0 Apr 20 21:51:33.563830 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563768 2566 generic.go:358] "Generic (PLEG): container finished" podID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerID="1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d" exitCode=0 Apr 20 21:51:33.563830 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563777 2566 generic.go:358] "Generic (PLEG): container finished" podID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerID="79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565" exitCode=0 Apr 20 21:51:33.563830 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563782 2566 generic.go:358] "Generic (PLEG): container finished" podID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerID="92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef" exitCode=0 Apr 20 21:51:33.563830 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563787 2566 generic.go:358] "Generic (PLEG): container finished" podID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerID="115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe" exitCode=0 Apr 20 21:51:33.563830 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563806 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerDied","Data":"ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1"} Apr 20 21:51:33.564173 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563837 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerDied","Data":"1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d"} Apr 20 21:51:33.564173 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563848 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerDied","Data":"79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565"} Apr 20 21:51:33.564173 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563857 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerDied","Data":"92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef"} Apr 20 21:51:33.564173 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:33.563866 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerDied","Data":"115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe"} Apr 20 21:51:34.459475 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.459451 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:51:34.568801 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.568722 2566 generic.go:358] "Generic (PLEG): container finished" podID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerID="9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014" exitCode=0 Apr 20 21:51:34.568931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.568801 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerDied","Data":"9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014"} Apr 20 21:51:34.568931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.568824 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:51:34.568931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.568843 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f962aea3-92ba-43bf-849e-8aadf364cbd5","Type":"ContainerDied","Data":"91596c2c08db2580b30897d2058568179a6d8d6278b6ce4c91c7c7f32d287a3e"} Apr 20 21:51:34.568931 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.568861 2566 scope.go:117] "RemoveContainer" containerID="ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1" Apr 20 21:51:34.575777 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.575761 2566 scope.go:117] "RemoveContainer" containerID="1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d" Apr 20 21:51:34.581986 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.581968 2566 scope.go:117] "RemoveContainer" containerID="79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565" Apr 20 21:51:34.588253 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.588238 2566 scope.go:117] "RemoveContainer" containerID="9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014" Apr 20 21:51:34.590291 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590263 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-main-db\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590378 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590307 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-volume\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590378 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:51:34.590337 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-tls-assets\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590378 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590368 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-trusted-ca-bundle\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590530 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590399 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590530 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590517 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2pfb\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-kube-api-access-s2pfb\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590603 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590577 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-metrics-client-ca\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590645 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590614 2566 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-cluster-tls-config\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590650 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-web-config\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590738 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590685 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-web\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.590841 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590801 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:51:34.591027 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590957 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:51:34.591027 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.590604 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:51:34.591169 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.591087 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-out\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.591169 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.591137 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.591274 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.591172 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy\") pod \"f962aea3-92ba-43bf-849e-8aadf364cbd5\" (UID: \"f962aea3-92ba-43bf-849e-8aadf364cbd5\") " Apr 20 21:51:34.591784 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.591453 2566 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-metrics-client-ca\") 
on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.591784 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.591477 2566 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-main-db\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.591784 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.591495 2566 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f962aea3-92ba-43bf-849e-8aadf364cbd5-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.594156 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.594102 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-kube-api-access-s2pfb" (OuterVolumeSpecName: "kube-api-access-s2pfb") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "kube-api-access-s2pfb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:51:34.594544 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.594515 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:51:34.594645 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.594544 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:51:34.594739 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.594719 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-out" (OuterVolumeSpecName: "config-out") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:51:34.594949 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.594889 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:51:34.595033 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.594964 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:51:34.595033 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.594999 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-volume" (OuterVolumeSpecName: "config-volume") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:51:34.595532 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.595510 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:51:34.596418 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.596399 2566 scope.go:117] "RemoveContainer" containerID="92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef" Apr 20 21:51:34.599187 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.599157 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:51:34.605631 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.605610 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-web-config" (OuterVolumeSpecName: "web-config") pod "f962aea3-92ba-43bf-849e-8aadf364cbd5" (UID: "f962aea3-92ba-43bf-849e-8aadf364cbd5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:51:34.611138 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.611121 2566 scope.go:117] "RemoveContainer" containerID="115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe" Apr 20 21:51:34.617259 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.617244 2566 scope.go:117] "RemoveContainer" containerID="8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11" Apr 20 21:51:34.623292 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.623262 2566 scope.go:117] "RemoveContainer" containerID="ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1" Apr 20 21:51:34.623542 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:51:34.623525 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1\": container with ID starting with ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1 not found: ID does not exist" containerID="ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1" Apr 20 21:51:34.623589 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.623550 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1"} err="failed to get container status \"ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1\": rpc error: code = NotFound desc = could not find container 
\"ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1\": container with ID starting with ec50d5a72263e8031ea2d32ea1f91c954f84436e27537f9edf1a6d1e01c58bc1 not found: ID does not exist" Apr 20 21:51:34.623589 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.623569 2566 scope.go:117] "RemoveContainer" containerID="1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d" Apr 20 21:51:34.623805 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:51:34.623789 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d\": container with ID starting with 1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d not found: ID does not exist" containerID="1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d" Apr 20 21:51:34.623849 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.623812 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d"} err="failed to get container status \"1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d\": rpc error: code = NotFound desc = could not find container \"1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d\": container with ID starting with 1407867ce04512a9689b6c18f842fffbb2a7056ab11c256fe0f78af1e497f48d not found: ID does not exist" Apr 20 21:51:34.623849 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.623827 2566 scope.go:117] "RemoveContainer" containerID="79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565" Apr 20 21:51:34.624040 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:51:34.624023 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565\": container with ID starting with 
79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565 not found: ID does not exist" containerID="79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565" Apr 20 21:51:34.624078 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624046 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565"} err="failed to get container status \"79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565\": rpc error: code = NotFound desc = could not find container \"79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565\": container with ID starting with 79a9cf0d725b6f9563babdac3b5d37dbd8a596330df301c1a35fb8667473a565 not found: ID does not exist" Apr 20 21:51:34.624078 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624063 2566 scope.go:117] "RemoveContainer" containerID="9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014" Apr 20 21:51:34.624289 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:51:34.624264 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014\": container with ID starting with 9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014 not found: ID does not exist" containerID="9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014" Apr 20 21:51:34.624335 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624304 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014"} err="failed to get container status \"9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014\": rpc error: code = NotFound desc = could not find container \"9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014\": container with ID starting with 
9908adc160c639ca59c577669011381e9272ae21cc0646c71ce482d49046f014 not found: ID does not exist" Apr 20 21:51:34.624335 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624319 2566 scope.go:117] "RemoveContainer" containerID="92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef" Apr 20 21:51:34.624540 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:51:34.624523 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef\": container with ID starting with 92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef not found: ID does not exist" containerID="92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef" Apr 20 21:51:34.624604 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624547 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef"} err="failed to get container status \"92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef\": rpc error: code = NotFound desc = could not find container \"92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef\": container with ID starting with 92d9d326151fb965fa54fac8abe44afe11a7f64a31404c114e4b7d5f9bc5e0ef not found: ID does not exist" Apr 20 21:51:34.624604 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624569 2566 scope.go:117] "RemoveContainer" containerID="115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe" Apr 20 21:51:34.624777 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:51:34.624761 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe\": container with ID starting with 115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe not found: ID does not exist" 
containerID="115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe" Apr 20 21:51:34.624819 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624781 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe"} err="failed to get container status \"115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe\": rpc error: code = NotFound desc = could not find container \"115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe\": container with ID starting with 115de3192c08defb35100c052035a871b3c0fe4999c57712b98a18911fe95bbe not found: ID does not exist" Apr 20 21:51:34.624819 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624795 2566 scope.go:117] "RemoveContainer" containerID="8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11" Apr 20 21:51:34.624982 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:51:34.624967 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11\": container with ID starting with 8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11 not found: ID does not exist" containerID="8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11" Apr 20 21:51:34.625019 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.624984 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11"} err="failed to get container status \"8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11\": rpc error: code = NotFound desc = could not find container \"8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11\": container with ID starting with 8b774505bde89345f97fe8d9c125c77d65e0c3a8348a59503f2bb85ee2826d11 not found: ID does not exist" Apr 20 
21:51:34.692450 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692409 2566 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-volume\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.692450 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692447 2566 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-tls-assets\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.692450 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692456 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-main-tls\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.692630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692465 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2pfb\" (UniqueName: \"kubernetes.io/projected/f962aea3-92ba-43bf-849e-8aadf364cbd5-kube-api-access-s2pfb\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.692630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692475 2566 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-cluster-tls-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.692630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692483 2566 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-web-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:51:34.692630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692495 2566 reconciler_common.go:299] 
"Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:51:34.692630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692508 2566 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f962aea3-92ba-43bf-849e-8aadf364cbd5-config-out\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:51:34.692630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692518 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:51:34.692630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.692526 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f962aea3-92ba-43bf-849e-8aadf364cbd5-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:51:34.892646 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.892597 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:51:34.896968 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.896933 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:51:34.921626 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921601 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:51:34.921881 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921869 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy-metric"
Apr 20 21:51:34.921923 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921883 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy-metric"
Apr 20 21:51:34.921923 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921897 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy"
Apr 20 21:51:34.921923 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921906 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy"
Apr 20 21:51:34.921923 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921917 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de9ba6bd-1329-40c2-b819-7dce1bdd20f0" containerName="registry"
Apr 20 21:51:34.921923 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921922 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9ba6bd-1329-40c2-b819-7dce1bdd20f0" containerName="registry"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921928 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="alertmanager"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921934 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="alertmanager"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921940 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="config-reloader"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921958 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="config-reloader"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921965 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" containerName="console"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921970 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" containerName="console"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921977 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="prom-label-proxy"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921987 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="prom-label-proxy"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.921996 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy-web"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922002 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy-web"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922012 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bef215f-5a33-4576-950c-039084f6b70e" containerName="console"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922017 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bef215f-5a33-4576-950c-039084f6b70e" containerName="console"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922023 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="init-config-reloader"
Apr 20 21:51:34.922062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922028 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="init-config-reloader"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922074 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="de9ba6bd-1329-40c2-b819-7dce1bdd20f0" containerName="registry"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922083 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="aedeb3a7-3b0e-427c-92e3-efcf4c2cb772" containerName="console"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922090 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy-metric"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922098 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy-web"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922105 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bef215f-5a33-4576-950c-039084f6b70e" containerName="console"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922111 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="kube-rbac-proxy"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922118 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="prom-label-proxy"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922123 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="config-reloader"
Apr 20 21:51:34.922509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.922127 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" containerName="alertmanager"
Apr 20 21:51:34.926947 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.926930 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:34.930207 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930066 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 21:51:34.930346 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930328 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 21:51:34.930499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930452 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 21:51:34.930499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930475 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 21:51:34.930499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930480 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 21:51:34.930499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930559 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 21:51:34.930499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930762 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-l4nr9\""
Apr 20 21:51:34.930499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930776 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 21:51:34.930499 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.930828 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 21:51:34.939808 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.939584 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 21:51:34.940324 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:34.940215 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:51:35.096356 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096322 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-web-config\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096540 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096381 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096540 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096447 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2452fa3-65aa-4e20-9f82-494b57157bab-config-out\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096540 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096493 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096540 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096516 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2452fa3-65aa-4e20-9f82-494b57157bab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096676 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096544 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e2452fa3-65aa-4e20-9f82-494b57157bab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096676 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096561 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms7kr\" (UniqueName: \"kubernetes.io/projected/e2452fa3-65aa-4e20-9f82-494b57157bab-kube-api-access-ms7kr\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096676 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096590 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096676 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096615 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096676 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096667 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096822 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096700 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-config-volume\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096822 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096726 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2452fa3-65aa-4e20-9f82-494b57157bab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.096822 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.096751 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2452fa3-65aa-4e20-9f82-494b57157bab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197494 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197420 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-web-config\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197494 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197468 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197498 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2452fa3-65aa-4e20-9f82-494b57157bab-config-out\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197523 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197541 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2452fa3-65aa-4e20-9f82-494b57157bab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197564 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e2452fa3-65aa-4e20-9f82-494b57157bab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197590 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms7kr\" (UniqueName: \"kubernetes.io/projected/e2452fa3-65aa-4e20-9f82-494b57157bab-kube-api-access-ms7kr\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197619 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.197666 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.197648 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.198129 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.198083 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e2452fa3-65aa-4e20-9f82-494b57157bab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.198687 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.198519 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2452fa3-65aa-4e20-9f82-494b57157bab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.198978 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.198908 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.199502 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.199439 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-config-volume\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.200322 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2452fa3-65aa-4e20-9f82-494b57157bab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.200574 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2452fa3-65aa-4e20-9f82-494b57157bab-config-out\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.200741 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.200891 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.200975 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201040 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.200984 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201333 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.201081 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2452fa3-65aa-4e20-9f82-494b57157bab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201333 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.201111 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-web-config\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201333 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.201143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2452fa3-65aa-4e20-9f82-494b57157bab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201511 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.201491 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.201557 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.201540 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e2452fa3-65aa-4e20-9f82-494b57157bab-config-volume\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.202956 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.202939 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2452fa3-65aa-4e20-9f82-494b57157bab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.205247 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.205226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms7kr\" (UniqueName: \"kubernetes.io/projected/e2452fa3-65aa-4e20-9f82-494b57157bab-kube-api-access-ms7kr\") pod \"alertmanager-main-0\" (UID: \"e2452fa3-65aa-4e20-9f82-494b57157bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.242825 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.242796 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:51:35.363172 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.363148 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:51:35.365344 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:51:35.365310 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2452fa3_65aa_4e20_9f82_494b57157bab.slice/crio-4d79199e6c80ad288c7fa6b77d7c330fe9ed1cdc55b9434c35ac8ecddc9f7e5f WatchSource:0}: Error finding container 4d79199e6c80ad288c7fa6b77d7c330fe9ed1cdc55b9434c35ac8ecddc9f7e5f: Status 404 returned error can't find the container with id 4d79199e6c80ad288c7fa6b77d7c330fe9ed1cdc55b9434c35ac8ecddc9f7e5f
Apr 20 21:51:35.574218 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.574184 2566 generic.go:358] "Generic (PLEG): container finished" podID="e2452fa3-65aa-4e20-9f82-494b57157bab" containerID="2bdcb048012237b61b24030032ce057356ab450cd6556514b9e506530def4701" exitCode=0
Apr 20 21:51:35.574575 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.574256 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2452fa3-65aa-4e20-9f82-494b57157bab","Type":"ContainerDied","Data":"2bdcb048012237b61b24030032ce057356ab450cd6556514b9e506530def4701"}
Apr 20 21:51:35.574575 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.574300 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2452fa3-65aa-4e20-9f82-494b57157bab","Type":"ContainerStarted","Data":"4d79199e6c80ad288c7fa6b77d7c330fe9ed1cdc55b9434c35ac8ecddc9f7e5f"}
Apr 20 21:51:35.773143 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:35.773115 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f962aea3-92ba-43bf-849e-8aadf364cbd5" path="/var/lib/kubelet/pods/f962aea3-92ba-43bf-849e-8aadf364cbd5/volumes"
Apr 20 21:51:36.580343 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:36.580251 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2452fa3-65aa-4e20-9f82-494b57157bab","Type":"ContainerStarted","Data":"7738e17995387ea2ea502cb9ddadde154d73566f845064fdc32323d6163ceb7a"}
Apr 20 21:51:36.580343 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:36.580301 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2452fa3-65aa-4e20-9f82-494b57157bab","Type":"ContainerStarted","Data":"e6127e8f6e5346ce9ed6d5ea3c0f654ddda219e651970a0af18798c5659b9575"}
Apr 20 21:51:36.580343 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:36.580313 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2452fa3-65aa-4e20-9f82-494b57157bab","Type":"ContainerStarted","Data":"1931f0e8a91009683329810cc85a91646078e0c437f950284529b9dd5a9bca6a"}
Apr 20 21:51:36.580343 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:36.580322 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2452fa3-65aa-4e20-9f82-494b57157bab","Type":"ContainerStarted","Data":"3187f15b2a99240e5531084cd0366f175bf19d429d8ae4bef079571ae108a082"}
Apr 20 21:51:36.580343 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:36.580330 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2452fa3-65aa-4e20-9f82-494b57157bab","Type":"ContainerStarted","Data":"68eb8b955d40dcc4cac0ac2746e9990ac6901f0c600b86a35c2bf3420caba55e"}
Apr 20 21:51:36.580343 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:36.580338 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2452fa3-65aa-4e20-9f82-494b57157bab","Type":"ContainerStarted","Data":"7bdd40ef64937c67473a39986320eca18de32e9ce62a51408a4042ad8ef1b834"}
Apr 20 21:51:36.608601 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:36.608528 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.6085112219999997 podStartE2EDuration="2.608511222s" podCreationTimestamp="2026-04-20 21:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:51:36.607733595 +0000 UTC m=+265.416420972" watchObservedRunningTime="2026-04-20 21:51:36.608511222 +0000 UTC m=+265.417198588"
Apr 20 21:51:37.233469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.233437 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5775b46cc4-lwprx"]
Apr 20 21:51:37.236934 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.236906 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.239509 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.239490 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 21:51:37.239703 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.239492 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 20 21:51:37.239703 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.239552 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-m74v6\""
Apr 20 21:51:37.239849 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.239792 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 21:51:37.240115 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.240091 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 21:51:37.240183 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.240137 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 21:51:37.248250 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.248216 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5775b46cc4-lwprx"]
Apr 20 21:51:37.252439 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.251006 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 20 21:51:37.319013 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.318980 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-telemeter-client-tls\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.319013 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.319014 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.319208 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.319037 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-metrics-client-ca\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.319208 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.319097 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-federate-client-tls\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.319208 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.319172 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v2gw\" (UniqueName: \"kubernetes.io/projected/ca4dd1d5-9a28-42a0-b615-9ab984977a89-kube-api-access-6v2gw\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.319327 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.319216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.319327 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.319243 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-secret-telemeter-client\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.319327 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.319259 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-serving-certs-ca-bundle\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.420564 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.420514 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.420768 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.420585 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-secret-telemeter-client\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.420768 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.420612 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-serving-certs-ca-bundle\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.420768 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.420668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-telemeter-client-tls\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.420768 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.420700 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx"
Apr 20 21:51:37.420768 ip-10-0-137-199 kubenswrapper[2566]: I0420
21:51:37.420737 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-metrics-client-ca\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.420997 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.420774 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-federate-client-tls\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.420997 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.420801 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6v2gw\" (UniqueName: \"kubernetes.io/projected/ca4dd1d5-9a28-42a0-b615-9ab984977a89-kube-api-access-6v2gw\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.422148 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.422098 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-serving-certs-ca-bundle\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.422514 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.422485 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.422643 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.422488 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca4dd1d5-9a28-42a0-b615-9ab984977a89-metrics-client-ca\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.424046 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.424025 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-telemeter-client-tls\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.424046 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.424033 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.424294 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.424257 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-secret-telemeter-client\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.424467 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:51:37.424443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ca4dd1d5-9a28-42a0-b615-9ab984977a89-federate-client-tls\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.428595 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.428566 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v2gw\" (UniqueName: \"kubernetes.io/projected/ca4dd1d5-9a28-42a0-b615-9ab984977a89-kube-api-access-6v2gw\") pod \"telemeter-client-5775b46cc4-lwprx\" (UID: \"ca4dd1d5-9a28-42a0-b615-9ab984977a89\") " pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.558350 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.558246 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" Apr 20 21:51:37.693019 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:37.692984 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5775b46cc4-lwprx"] Apr 20 21:51:37.695140 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:51:37.695107 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca4dd1d5_9a28_42a0_b615_9ab984977a89.slice/crio-af2df9b89932bae83b55a0383478869d8a4d0a2a63ae1233fc808675f73267e3 WatchSource:0}: Error finding container af2df9b89932bae83b55a0383478869d8a4d0a2a63ae1233fc808675f73267e3: Status 404 returned error can't find the container with id af2df9b89932bae83b55a0383478869d8a4d0a2a63ae1233fc808675f73267e3 Apr 20 21:51:38.589440 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:38.589400 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" 
event={"ID":"ca4dd1d5-9a28-42a0-b615-9ab984977a89","Type":"ContainerStarted","Data":"af2df9b89932bae83b55a0383478869d8a4d0a2a63ae1233fc808675f73267e3"} Apr 20 21:51:40.598321 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:40.598268 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" event={"ID":"ca4dd1d5-9a28-42a0-b615-9ab984977a89","Type":"ContainerStarted","Data":"e2c055a89b736a742d5bb68bc899a385806c1badb57b852ac6b33b90d87f50da"} Apr 20 21:51:40.598699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:40.598327 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" event={"ID":"ca4dd1d5-9a28-42a0-b615-9ab984977a89","Type":"ContainerStarted","Data":"30708e29d39444c499770d514e1d2e91b50e016dd87538960855968fb5890681"} Apr 20 21:51:40.598699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:40.598343 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" event={"ID":"ca4dd1d5-9a28-42a0-b615-9ab984977a89","Type":"ContainerStarted","Data":"652c235d4b34a1261ed5a42554bd6f24ed3eac0bd315764f43d67e6f13b22650"} Apr 20 21:51:40.623820 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:40.623743 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5775b46cc4-lwprx" podStartSLOduration=1.778019091 podStartE2EDuration="3.623728844s" podCreationTimestamp="2026-04-20 21:51:37 +0000 UTC" firstStartedPulling="2026-04-20 21:51:37.697271814 +0000 UTC m=+266.505959159" lastFinishedPulling="2026-04-20 21:51:39.542981564 +0000 UTC m=+268.351668912" observedRunningTime="2026-04-20 21:51:40.621970306 +0000 UTC m=+269.430657681" watchObservedRunningTime="2026-04-20 21:51:40.623728844 +0000 UTC m=+269.432416209" Apr 20 21:51:41.509384 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.509347 2566 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-5f75c646df-l4j42"] Apr 20 21:51:41.513166 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.513134 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.521817 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.521785 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f75c646df-l4j42"] Apr 20 21:51:41.560039 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.560006 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-oauth-config\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.560170 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.560054 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-service-ca\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.560170 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.560150 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-oauth-serving-cert\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.560260 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.560228 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-serving-cert\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.560322 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.560257 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-console-config\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.560322 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.560302 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-trusted-ca-bundle\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.560400 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.560328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r42tp\" (UniqueName: \"kubernetes.io/projected/2d419d99-ee09-4b6e-8cea-05462672af45-kube-api-access-r42tp\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.660848 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.660809 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-trusted-ca-bundle\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.660848 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:51:41.660847 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r42tp\" (UniqueName: \"kubernetes.io/projected/2d419d99-ee09-4b6e-8cea-05462672af45-kube-api-access-r42tp\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.661314 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.661018 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-oauth-config\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.661314 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.661129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-service-ca\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.661314 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.661211 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-oauth-serving-cert\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.661497 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.661345 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-serving-cert\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " 
pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.661497 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.661414 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-console-config\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.661831 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.661800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-trusted-ca-bundle\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.661930 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.661905 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-service-ca\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.662086 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.662069 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-console-config\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.662457 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.662436 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-oauth-serving-cert\") pod \"console-5f75c646df-l4j42\" (UID: 
\"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.663534 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.663514 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-oauth-config\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.663815 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.663796 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-serving-cert\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.671516 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.671494 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r42tp\" (UniqueName: \"kubernetes.io/projected/2d419d99-ee09-4b6e-8cea-05462672af45-kube-api-access-r42tp\") pod \"console-5f75c646df-l4j42\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.823251 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.823164 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:41.938403 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:41.938359 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f75c646df-l4j42"] Apr 20 21:51:41.941172 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:51:41.941131 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d419d99_ee09_4b6e_8cea_05462672af45.slice/crio-11e89faa5654cce552797a4e335010eb179f1735a3ab579e916312871d4ef2ee WatchSource:0}: Error finding container 11e89faa5654cce552797a4e335010eb179f1735a3ab579e916312871d4ef2ee: Status 404 returned error can't find the container with id 11e89faa5654cce552797a4e335010eb179f1735a3ab579e916312871d4ef2ee Apr 20 21:51:42.606449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:42.606410 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f75c646df-l4j42" event={"ID":"2d419d99-ee09-4b6e-8cea-05462672af45","Type":"ContainerStarted","Data":"268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc"} Apr 20 21:51:42.606449 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:42.606445 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f75c646df-l4j42" event={"ID":"2d419d99-ee09-4b6e-8cea-05462672af45","Type":"ContainerStarted","Data":"11e89faa5654cce552797a4e335010eb179f1735a3ab579e916312871d4ef2ee"} Apr 20 21:51:42.623731 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:42.623677 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f75c646df-l4j42" podStartSLOduration=1.623658967 podStartE2EDuration="1.623658967s" podCreationTimestamp="2026-04-20 21:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:51:42.62273361 +0000 UTC 
m=+271.431421156" watchObservedRunningTime="2026-04-20 21:51:42.623658967 +0000 UTC m=+271.432346333" Apr 20 21:51:51.824260 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:51.824227 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:51.824260 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:51.824259 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:51.828848 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:51.828824 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:52.640062 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:52.640035 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:51:52.684614 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:51:52.684579 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5475666778-27zs5"] Apr 20 21:52:11.669907 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:11.669881 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 21:52:11.670504 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:11.669970 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 21:52:11.673875 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:11.673854 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 21:52:17.705607 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:17.705537 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5475666778-27zs5" 
podUID="a628e407-40cc-4e12-b139-946e014e4c8e" containerName="console" containerID="cri-o://4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef" gracePeriod=15 Apr 20 21:52:17.947730 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:17.947709 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5475666778-27zs5_a628e407-40cc-4e12-b139-946e014e4c8e/console/0.log" Apr 20 21:52:17.947843 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:17.947768 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5475666778-27zs5" Apr 20 21:52:18.056841 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.056813 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-trusted-ca-bundle\") pod \"a628e407-40cc-4e12-b139-946e014e4c8e\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " Apr 20 21:52:18.057018 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.056847 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-serving-cert\") pod \"a628e407-40cc-4e12-b139-946e014e4c8e\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " Apr 20 21:52:18.057018 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.056906 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-oauth-serving-cert\") pod \"a628e407-40cc-4e12-b139-946e014e4c8e\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " Apr 20 21:52:18.057018 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.056932 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-oauth-config\") pod \"a628e407-40cc-4e12-b139-946e014e4c8e\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " Apr 20 21:52:18.057018 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.056964 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-console-config\") pod \"a628e407-40cc-4e12-b139-946e014e4c8e\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " Apr 20 21:52:18.057229 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.057027 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54rlx\" (UniqueName: \"kubernetes.io/projected/a628e407-40cc-4e12-b139-946e014e4c8e-kube-api-access-54rlx\") pod \"a628e407-40cc-4e12-b139-946e014e4c8e\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " Apr 20 21:52:18.057229 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.057055 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-service-ca\") pod \"a628e407-40cc-4e12-b139-946e014e4c8e\" (UID: \"a628e407-40cc-4e12-b139-946e014e4c8e\") " Apr 20 21:52:18.057371 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.057348 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a628e407-40cc-4e12-b139-946e014e4c8e" (UID: "a628e407-40cc-4e12-b139-946e014e4c8e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:52:18.057632 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.057527 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-service-ca" (OuterVolumeSpecName: "service-ca") pod "a628e407-40cc-4e12-b139-946e014e4c8e" (UID: "a628e407-40cc-4e12-b139-946e014e4c8e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:52:18.057710 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.057663 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-console-config" (OuterVolumeSpecName: "console-config") pod "a628e407-40cc-4e12-b139-946e014e4c8e" (UID: "a628e407-40cc-4e12-b139-946e014e4c8e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:52:18.057710 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.057683 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a628e407-40cc-4e12-b139-946e014e4c8e" (UID: "a628e407-40cc-4e12-b139-946e014e4c8e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:52:18.059193 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.059166 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a628e407-40cc-4e12-b139-946e014e4c8e" (UID: "a628e407-40cc-4e12-b139-946e014e4c8e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:52:18.059312 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.059205 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a628e407-40cc-4e12-b139-946e014e4c8e" (UID: "a628e407-40cc-4e12-b139-946e014e4c8e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:52:18.059312 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.059271 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a628e407-40cc-4e12-b139-946e014e4c8e-kube-api-access-54rlx" (OuterVolumeSpecName: "kube-api-access-54rlx") pod "a628e407-40cc-4e12-b139-946e014e4c8e" (UID: "a628e407-40cc-4e12-b139-946e014e4c8e"). InnerVolumeSpecName "kube-api-access-54rlx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:52:18.158319 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.158296 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-oauth-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:52:18.158319 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.158315 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-oauth-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:52:18.158458 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.158326 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-console-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:52:18.158458 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:52:18.158335 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-54rlx\" (UniqueName: \"kubernetes.io/projected/a628e407-40cc-4e12-b139-946e014e4c8e-kube-api-access-54rlx\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:52:18.158458 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.158344 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-service-ca\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:52:18.158458 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.158353 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a628e407-40cc-4e12-b139-946e014e4c8e-trusted-ca-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:52:18.158458 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.158361 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a628e407-40cc-4e12-b139-946e014e4c8e-console-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:52:18.712087 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.712062 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5475666778-27zs5_a628e407-40cc-4e12-b139-946e014e4c8e/console/0.log" Apr 20 21:52:18.712569 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.712099 2566 generic.go:358] "Generic (PLEG): container finished" podID="a628e407-40cc-4e12-b139-946e014e4c8e" containerID="4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef" exitCode=2 Apr 20 21:52:18.712569 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.712134 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5475666778-27zs5" 
event={"ID":"a628e407-40cc-4e12-b139-946e014e4c8e","Type":"ContainerDied","Data":"4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef"} Apr 20 21:52:18.712569 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.712156 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5475666778-27zs5" event={"ID":"a628e407-40cc-4e12-b139-946e014e4c8e","Type":"ContainerDied","Data":"6577ab426fa336ec1eb92be95411bec8fc315856d10c6851c847d312813aa09b"} Apr 20 21:52:18.712569 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.712171 2566 scope.go:117] "RemoveContainer" containerID="4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef" Apr 20 21:52:18.712569 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.712179 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5475666778-27zs5" Apr 20 21:52:18.720095 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.719886 2566 scope.go:117] "RemoveContainer" containerID="4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef" Apr 20 21:52:18.720220 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:52:18.720190 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef\": container with ID starting with 4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef not found: ID does not exist" containerID="4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef" Apr 20 21:52:18.720273 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.720230 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef"} err="failed to get container status \"4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef\": rpc error: code = NotFound desc = could not find container 
\"4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef\": container with ID starting with 4ea7a9b251a9530e2a211e31b4488167fea7a5934fd70fc1e488484f70834fef not found: ID does not exist" Apr 20 21:52:18.733840 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.733814 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5475666778-27zs5"] Apr 20 21:52:18.739191 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:18.739172 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5475666778-27zs5"] Apr 20 21:52:19.773472 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:19.773440 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a628e407-40cc-4e12-b139-946e014e4c8e" path="/var/lib/kubelet/pods/a628e407-40cc-4e12-b139-946e014e4c8e/volumes" Apr 20 21:52:53.559627 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.559597 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74f4fd8fbc-m7zxp"] Apr 20 21:52:53.560031 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.559937 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a628e407-40cc-4e12-b139-946e014e4c8e" containerName="console" Apr 20 21:52:53.560031 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.559949 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a628e407-40cc-4e12-b139-946e014e4c8e" containerName="console" Apr 20 21:52:53.560031 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.560016 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a628e407-40cc-4e12-b139-946e014e4c8e" containerName="console" Apr 20 21:52:53.564091 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.564070 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.573558 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.573533 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74f4fd8fbc-m7zxp"] Apr 20 21:52:53.618457 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.618436 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-serving-cert\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.618571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.618465 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-config\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.618571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.618483 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-trusted-ca-bundle\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.618571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.618518 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-oauth-config\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 
21:52:53.618571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.618533 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-oauth-serving-cert\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.618571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.618560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-service-ca\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.618779 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.618581 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf9ht\" (UniqueName: \"kubernetes.io/projected/abd53352-97ee-4fd9-86bf-f96dd62d3a92-kube-api-access-kf9ht\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.719795 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.719764 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-config\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.719795 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.719798 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-trusted-ca-bundle\") pod 
\"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.719922 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.719824 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-oauth-config\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.719922 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.719847 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-oauth-serving-cert\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.719922 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.719885 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-service-ca\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.719922 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.719910 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf9ht\" (UniqueName: \"kubernetes.io/projected/abd53352-97ee-4fd9-86bf-f96dd62d3a92-kube-api-access-kf9ht\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.720065 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.719961 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-serving-cert\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.720633 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.720602 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-config\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.720708 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.720602 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-oauth-serving-cert\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.720708 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.720663 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-service-ca\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.720994 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.720975 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-trusted-ca-bundle\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.722561 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.722538 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-oauth-config\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.722722 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.722701 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-serving-cert\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.727604 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.727575 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf9ht\" (UniqueName: \"kubernetes.io/projected/abd53352-97ee-4fd9-86bf-f96dd62d3a92-kube-api-access-kf9ht\") pod \"console-74f4fd8fbc-m7zxp\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") " pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.873329 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.873252 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:52:53.991896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.991872 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74f4fd8fbc-m7zxp"] Apr 20 21:52:53.993673 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:52:53.993644 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabd53352_97ee_4fd9_86bf_f96dd62d3a92.slice/crio-58d17e5c5da77054293aafef2138b6746d9a74a58e23a8164a87a9ebca2bfa84 WatchSource:0}: Error finding container 58d17e5c5da77054293aafef2138b6746d9a74a58e23a8164a87a9ebca2bfa84: Status 404 returned error can't find the container with id 58d17e5c5da77054293aafef2138b6746d9a74a58e23a8164a87a9ebca2bfa84 Apr 20 21:52:53.995497 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:53.995480 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:52:54.816328 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:54.816293 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74f4fd8fbc-m7zxp" event={"ID":"abd53352-97ee-4fd9-86bf-f96dd62d3a92","Type":"ContainerStarted","Data":"9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723"} Apr 20 21:52:54.816328 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:54.816327 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74f4fd8fbc-m7zxp" event={"ID":"abd53352-97ee-4fd9-86bf-f96dd62d3a92","Type":"ContainerStarted","Data":"58d17e5c5da77054293aafef2138b6746d9a74a58e23a8164a87a9ebca2bfa84"} Apr 20 21:52:54.833680 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:52:54.833637 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74f4fd8fbc-m7zxp" podStartSLOduration=1.8336242029999998 podStartE2EDuration="1.833624203s" podCreationTimestamp="2026-04-20 21:52:53 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:52:54.831547571 +0000 UTC m=+343.640234970" watchObservedRunningTime="2026-04-20 21:52:54.833624203 +0000 UTC m=+343.642311568" Apr 20 21:53:03.874469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:03.874394 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:53:03.874469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:03.874436 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:53:03.878967 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:03.878947 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:53:04.850771 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:04.850738 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74f4fd8fbc-m7zxp" Apr 20 21:53:04.886238 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:04.886206 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f75c646df-l4j42"] Apr 20 21:53:29.905189 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:29.905145 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f75c646df-l4j42" podUID="2d419d99-ee09-4b6e-8cea-05462672af45" containerName="console" containerID="cri-o://268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc" gracePeriod=15 Apr 20 21:53:30.138970 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.138948 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f75c646df-l4j42_2d419d99-ee09-4b6e-8cea-05462672af45/console/0.log" Apr 20 21:53:30.139079 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.139006 2566 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:53:30.195567 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.195498 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-oauth-config\") pod \"2d419d99-ee09-4b6e-8cea-05462672af45\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " Apr 20 21:53:30.195686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.195581 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r42tp\" (UniqueName: \"kubernetes.io/projected/2d419d99-ee09-4b6e-8cea-05462672af45-kube-api-access-r42tp\") pod \"2d419d99-ee09-4b6e-8cea-05462672af45\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " Apr 20 21:53:30.195686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.195601 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-service-ca\") pod \"2d419d99-ee09-4b6e-8cea-05462672af45\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " Apr 20 21:53:30.195686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.195615 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-console-config\") pod \"2d419d99-ee09-4b6e-8cea-05462672af45\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " Apr 20 21:53:30.195686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.195636 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-trusted-ca-bundle\") pod \"2d419d99-ee09-4b6e-8cea-05462672af45\" (UID: 
\"2d419d99-ee09-4b6e-8cea-05462672af45\") " Apr 20 21:53:30.195686 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.195666 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-serving-cert\") pod \"2d419d99-ee09-4b6e-8cea-05462672af45\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " Apr 20 21:53:30.195921 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.195834 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-oauth-serving-cert\") pod \"2d419d99-ee09-4b6e-8cea-05462672af45\" (UID: \"2d419d99-ee09-4b6e-8cea-05462672af45\") " Apr 20 21:53:30.196023 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.195992 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-service-ca" (OuterVolumeSpecName: "service-ca") pod "2d419d99-ee09-4b6e-8cea-05462672af45" (UID: "2d419d99-ee09-4b6e-8cea-05462672af45"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:53:30.196083 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.196037 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-console-config" (OuterVolumeSpecName: "console-config") pod "2d419d99-ee09-4b6e-8cea-05462672af45" (UID: "2d419d99-ee09-4b6e-8cea-05462672af45"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:53:30.196149 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.196124 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2d419d99-ee09-4b6e-8cea-05462672af45" (UID: "2d419d99-ee09-4b6e-8cea-05462672af45"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:53:30.196149 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.196143 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-service-ca\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:53:30.196247 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.196161 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-console-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:53:30.196319 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.196273 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2d419d99-ee09-4b6e-8cea-05462672af45" (UID: "2d419d99-ee09-4b6e-8cea-05462672af45"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:53:30.197713 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.197688 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2d419d99-ee09-4b6e-8cea-05462672af45" (UID: "2d419d99-ee09-4b6e-8cea-05462672af45"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:53:30.197817 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.197736 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2d419d99-ee09-4b6e-8cea-05462672af45" (UID: "2d419d99-ee09-4b6e-8cea-05462672af45"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:53:30.197817 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.197757 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d419d99-ee09-4b6e-8cea-05462672af45-kube-api-access-r42tp" (OuterVolumeSpecName: "kube-api-access-r42tp") pod "2d419d99-ee09-4b6e-8cea-05462672af45" (UID: "2d419d99-ee09-4b6e-8cea-05462672af45"). InnerVolumeSpecName "kube-api-access-r42tp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:53:30.297413 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.297371 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r42tp\" (UniqueName: \"kubernetes.io/projected/2d419d99-ee09-4b6e-8cea-05462672af45-kube-api-access-r42tp\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:53:30.297413 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.297410 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-trusted-ca-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:53:30.297413 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.297420 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" 
DevicePath \"\"" Apr 20 21:53:30.297599 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.297429 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d419d99-ee09-4b6e-8cea-05462672af45-oauth-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:53:30.297599 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.297438 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d419d99-ee09-4b6e-8cea-05462672af45-console-oauth-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:53:30.915679 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.915649 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f75c646df-l4j42_2d419d99-ee09-4b6e-8cea-05462672af45/console/0.log" Apr 20 21:53:30.916147 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.915688 2566 generic.go:358] "Generic (PLEG): container finished" podID="2d419d99-ee09-4b6e-8cea-05462672af45" containerID="268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc" exitCode=2 Apr 20 21:53:30.916147 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.915738 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f75c646df-l4j42" event={"ID":"2d419d99-ee09-4b6e-8cea-05462672af45","Type":"ContainerDied","Data":"268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc"} Apr 20 21:53:30.916147 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.915761 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f75c646df-l4j42" event={"ID":"2d419d99-ee09-4b6e-8cea-05462672af45","Type":"ContainerDied","Data":"11e89faa5654cce552797a4e335010eb179f1735a3ab579e916312871d4ef2ee"} Apr 20 21:53:30.916147 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.915784 2566 scope.go:117] "RemoveContainer" 
containerID="268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc" Apr 20 21:53:30.916147 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.915793 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f75c646df-l4j42" Apr 20 21:53:30.925525 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.925346 2566 scope.go:117] "RemoveContainer" containerID="268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc" Apr 20 21:53:30.925629 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:53:30.925612 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc\": container with ID starting with 268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc not found: ID does not exist" containerID="268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc" Apr 20 21:53:30.925671 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.925635 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc"} err="failed to get container status \"268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc\": rpc error: code = NotFound desc = could not find container \"268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc\": container with ID starting with 268dd382d08beaaa9895038b21f048a175682c441b0ed79fbfeb2c8c38cd2cfc not found: ID does not exist" Apr 20 21:53:30.936348 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.936326 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f75c646df-l4j42"] Apr 20 21:53:30.939250 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:30.939232 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f75c646df-l4j42"] Apr 20 21:53:31.775527 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:53:31.775496 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d419d99-ee09-4b6e-8cea-05462672af45" path="/var/lib/kubelet/pods/2d419d99-ee09-4b6e-8cea-05462672af45/volumes"
Apr 20 21:53:53.856044 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.856013 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"]
Apr 20 21:53:53.856481 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.856358 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d419d99-ee09-4b6e-8cea-05462672af45" containerName="console"
Apr 20 21:53:53.856481 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.856371 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d419d99-ee09-4b6e-8cea-05462672af45" containerName="console"
Apr 20 21:53:53.856481 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.856418 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d419d99-ee09-4b6e-8cea-05462672af45" containerName="console"
Apr 20 21:53:53.860579 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.860561 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.864129 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.864100 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 21:53:53.865357 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.865327 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 21:53:53.865600 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.865585 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5b4bm\""
Apr 20 21:53:53.867671 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.867634 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"]
Apr 20 21:53:53.883867 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.883846 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.883982 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.883887 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhswh\" (UniqueName: \"kubernetes.io/projected/09e0739c-1805-473a-8337-9f492b8d3c70-kube-api-access-bhswh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.883982 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.883910 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.984885 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.984864 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhswh\" (UniqueName: \"kubernetes.io/projected/09e0739c-1805-473a-8337-9f492b8d3c70-kube-api-access-bhswh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.985017 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.984895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.985017 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.984943 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.985256 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.985239 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.985353 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.985315 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:53.992870 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:53.992848 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhswh\" (UniqueName: \"kubernetes.io/projected/09e0739c-1805-473a-8337-9f492b8d3c70-kube-api-access-bhswh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:54.171013 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:54.170918 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:53:54.283769 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:54.283740 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"]
Apr 20 21:53:54.286894 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:53:54.286869 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09e0739c_1805_473a_8337_9f492b8d3c70.slice/crio-bbf36b140eb2333dca9aef6ae5b031ada82391221a0283fdb68992d61961532e WatchSource:0}: Error finding container bbf36b140eb2333dca9aef6ae5b031ada82391221a0283fdb68992d61961532e: Status 404 returned error can't find the container with id bbf36b140eb2333dca9aef6ae5b031ada82391221a0283fdb68992d61961532e
Apr 20 21:53:54.981407 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:54.981371 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc" event={"ID":"09e0739c-1805-473a-8337-9f492b8d3c70","Type":"ContainerStarted","Data":"bbf36b140eb2333dca9aef6ae5b031ada82391221a0283fdb68992d61961532e"}
Apr 20 21:53:59.998664 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:59.998631 2566 generic.go:358] "Generic (PLEG): container finished" podID="09e0739c-1805-473a-8337-9f492b8d3c70" containerID="2c9e0aa51f50178712048d3af7ffdf0f2a48c24c5d065ec966bd931dfa9c6e24" exitCode=0
Apr 20 21:53:59.999103 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:53:59.998710 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc" event={"ID":"09e0739c-1805-473a-8337-9f492b8d3c70","Type":"ContainerDied","Data":"2c9e0aa51f50178712048d3af7ffdf0f2a48c24c5d065ec966bd931dfa9c6e24"}
Apr 20 21:54:03.008369 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:03.008336 2566 generic.go:358] "Generic (PLEG): container finished" podID="09e0739c-1805-473a-8337-9f492b8d3c70" containerID="37713e01e9747f23c010d5a867517d9c08285f6a1023a6fc7f2f294de567dbb5" exitCode=0
Apr 20 21:54:03.008745 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:03.008432 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc" event={"ID":"09e0739c-1805-473a-8337-9f492b8d3c70","Type":"ContainerDied","Data":"37713e01e9747f23c010d5a867517d9c08285f6a1023a6fc7f2f294de567dbb5"}
Apr 20 21:54:09.029552 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:09.029461 2566 generic.go:358] "Generic (PLEG): container finished" podID="09e0739c-1805-473a-8337-9f492b8d3c70" containerID="fdf491a2d0b435ef1e66510d58ebe478e6a99ce8c2073a01ca2316e87983c144" exitCode=0
Apr 20 21:54:09.029879 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:09.029548 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc" event={"ID":"09e0739c-1805-473a-8337-9f492b8d3c70","Type":"ContainerDied","Data":"fdf491a2d0b435ef1e66510d58ebe478e6a99ce8c2073a01ca2316e87983c144"}
Apr 20 21:54:10.149594 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.149569 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:54:10.210058 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.210029 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-bundle\") pod \"09e0739c-1805-473a-8337-9f492b8d3c70\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") "
Apr 20 21:54:10.210187 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.210086 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-util\") pod \"09e0739c-1805-473a-8337-9f492b8d3c70\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") "
Apr 20 21:54:10.210187 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.210126 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhswh\" (UniqueName: \"kubernetes.io/projected/09e0739c-1805-473a-8337-9f492b8d3c70-kube-api-access-bhswh\") pod \"09e0739c-1805-473a-8337-9f492b8d3c70\" (UID: \"09e0739c-1805-473a-8337-9f492b8d3c70\") "
Apr 20 21:54:10.210660 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.210629 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-bundle" (OuterVolumeSpecName: "bundle") pod "09e0739c-1805-473a-8337-9f492b8d3c70" (UID: "09e0739c-1805-473a-8337-9f492b8d3c70"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:54:10.212237 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.212215 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e0739c-1805-473a-8337-9f492b8d3c70-kube-api-access-bhswh" (OuterVolumeSpecName: "kube-api-access-bhswh") pod "09e0739c-1805-473a-8337-9f492b8d3c70" (UID: "09e0739c-1805-473a-8337-9f492b8d3c70"). InnerVolumeSpecName "kube-api-access-bhswh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:54:10.214998 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.214979 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-util" (OuterVolumeSpecName: "util") pod "09e0739c-1805-473a-8337-9f492b8d3c70" (UID: "09e0739c-1805-473a-8337-9f492b8d3c70"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:54:10.311402 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.311329 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:54:10.311402 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.311364 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhswh\" (UniqueName: \"kubernetes.io/projected/09e0739c-1805-473a-8337-9f492b8d3c70-kube-api-access-bhswh\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:54:10.311402 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:10.311374 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09e0739c-1805-473a-8337-9f492b8d3c70-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:54:11.037224 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:11.037185 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc" event={"ID":"09e0739c-1805-473a-8337-9f492b8d3c70","Type":"ContainerDied","Data":"bbf36b140eb2333dca9aef6ae5b031ada82391221a0283fdb68992d61961532e"}
Apr 20 21:54:11.037224 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:11.037221 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf36b140eb2333dca9aef6ae5b031ada82391221a0283fdb68992d61961532e"
Apr 20 21:54:11.037224 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:11.037222 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d7lmrc"
Apr 20 21:54:16.734694 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.734659 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"]
Apr 20 21:54:16.735125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.734995 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09e0739c-1805-473a-8337-9f492b8d3c70" containerName="util"
Apr 20 21:54:16.735125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.735007 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e0739c-1805-473a-8337-9f492b8d3c70" containerName="util"
Apr 20 21:54:16.735125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.735015 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09e0739c-1805-473a-8337-9f492b8d3c70" containerName="extract"
Apr 20 21:54:16.735125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.735021 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e0739c-1805-473a-8337-9f492b8d3c70" containerName="extract"
Apr 20 21:54:16.735125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.735028 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09e0739c-1805-473a-8337-9f492b8d3c70" containerName="pull"
Apr 20 21:54:16.735125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.735034 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e0739c-1805-473a-8337-9f492b8d3c70" containerName="pull"
Apr 20 21:54:16.735125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.735074 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="09e0739c-1805-473a-8337-9f492b8d3c70" containerName="extract"
Apr 20 21:54:16.741378 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.741361 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"
Apr 20 21:54:16.744205 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.744170 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 20 21:54:16.744348 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.744204 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-nk28m\""
Apr 20 21:54:16.744719 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.744698 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:54:16.747447 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.747422 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"]
Apr 20 21:54:16.863728 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.863694 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvq6m\" (UniqueName: \"kubernetes.io/projected/a6423a97-a4d1-4243-82e9-2bb59026aa12-kube-api-access-tvq6m\") pod \"cert-manager-operator-controller-manager-54b9655956-vkcjq\" (UID: \"a6423a97-a4d1-4243-82e9-2bb59026aa12\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"
Apr 20 21:54:16.863896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.863741 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a6423a97-a4d1-4243-82e9-2bb59026aa12-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-vkcjq\" (UID: \"a6423a97-a4d1-4243-82e9-2bb59026aa12\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"
Apr 20 21:54:16.964275 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.964230 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a6423a97-a4d1-4243-82e9-2bb59026aa12-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-vkcjq\" (UID: \"a6423a97-a4d1-4243-82e9-2bb59026aa12\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"
Apr 20 21:54:16.964413 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.964339 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvq6m\" (UniqueName: \"kubernetes.io/projected/a6423a97-a4d1-4243-82e9-2bb59026aa12-kube-api-access-tvq6m\") pod \"cert-manager-operator-controller-manager-54b9655956-vkcjq\" (UID: \"a6423a97-a4d1-4243-82e9-2bb59026aa12\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"
Apr 20 21:54:16.964612 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.964594 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a6423a97-a4d1-4243-82e9-2bb59026aa12-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-vkcjq\" (UID: \"a6423a97-a4d1-4243-82e9-2bb59026aa12\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"
Apr 20 21:54:16.972301 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:16.972265 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvq6m\" (UniqueName: \"kubernetes.io/projected/a6423a97-a4d1-4243-82e9-2bb59026aa12-kube-api-access-tvq6m\") pod \"cert-manager-operator-controller-manager-54b9655956-vkcjq\" (UID: \"a6423a97-a4d1-4243-82e9-2bb59026aa12\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"
Apr 20 21:54:17.051346 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:17.051242 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"
Apr 20 21:54:17.170049 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:17.170025 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq"]
Apr 20 21:54:17.172276 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:54:17.172247 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6423a97_a4d1_4243_82e9_2bb59026aa12.slice/crio-bfcd7160ca221ccee0df3a32685ae73e3fc3ec10166ae3e31cf131bee8fb6413 WatchSource:0}: Error finding container bfcd7160ca221ccee0df3a32685ae73e3fc3ec10166ae3e31cf131bee8fb6413: Status 404 returned error can't find the container with id bfcd7160ca221ccee0df3a32685ae73e3fc3ec10166ae3e31cf131bee8fb6413
Apr 20 21:54:18.057947 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:18.057910 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq" event={"ID":"a6423a97-a4d1-4243-82e9-2bb59026aa12","Type":"ContainerStarted","Data":"bfcd7160ca221ccee0df3a32685ae73e3fc3ec10166ae3e31cf131bee8fb6413"}
Apr 20 21:54:22.071169 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:22.071133 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq" event={"ID":"a6423a97-a4d1-4243-82e9-2bb59026aa12","Type":"ContainerStarted","Data":"7dbc2b0abcf897d8386a75119143c4974aee11473e98dc50386de7da305aa4a6"}
Apr 20 21:54:22.089074 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:22.089029 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vkcjq" podStartSLOduration=2.199555758 podStartE2EDuration="6.089014917s" podCreationTimestamp="2026-04-20 21:54:16 +0000 UTC" firstStartedPulling="2026-04-20 21:54:17.174710858 +0000 UTC m=+425.983398204" lastFinishedPulling="2026-04-20 21:54:21.06417002 +0000 UTC m=+429.872857363" observedRunningTime="2026-04-20 21:54:22.088248009 +0000 UTC m=+430.896935373" watchObservedRunningTime="2026-04-20 21:54:22.089014917 +0000 UTC m=+430.897702281"
Apr 20 21:54:23.245523 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.245492 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"]
Apr 20 21:54:23.249045 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.249021 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.251627 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.251602 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 21:54:23.251718 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.251602 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 21:54:23.252710 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.252692 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5b4bm\""
Apr 20 21:54:23.256161 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.255980 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"]
Apr 20 21:54:23.311434 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.311405 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.311574 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.311454 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.311574 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.311510 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h6h5\" (UniqueName: \"kubernetes.io/projected/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-kube-api-access-8h6h5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.412864 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.412835 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.412988 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.412868 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h6h5\" (UniqueName: \"kubernetes.io/projected/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-kube-api-access-8h6h5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.412988 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.412917 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.413213 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.413194 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.413258 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.413230 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.423034 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.423008 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h6h5\" (UniqueName: \"kubernetes.io/projected/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-kube-api-access-8h6h5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.559430 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.559368 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:23.684758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:23.684736 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"]
Apr 20 21:54:23.687190 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:54:23.687162 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa492c56_55e4_435a_8ba8_5e0ef51ea8da.slice/crio-4bbe36337c5739bf124f786030bfe1945dd61bff9ae2675b28f53937de185476 WatchSource:0}: Error finding container 4bbe36337c5739bf124f786030bfe1945dd61bff9ae2675b28f53937de185476: Status 404 returned error can't find the container with id 4bbe36337c5739bf124f786030bfe1945dd61bff9ae2675b28f53937de185476
Apr 20 21:54:24.078276 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.078242 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerID="1a1f5c8f0e55ba64603f4d60234e6fe0c1a9ca0d9891b07e5e3b1e63bf1c3d08" exitCode=0
Apr 20 21:54:24.078435 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.078319 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw" event={"ID":"aa492c56-55e4-435a-8ba8-5e0ef51ea8da","Type":"ContainerDied","Data":"1a1f5c8f0e55ba64603f4d60234e6fe0c1a9ca0d9891b07e5e3b1e63bf1c3d08"}
Apr 20 21:54:24.078435 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.078351 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw" event={"ID":"aa492c56-55e4-435a-8ba8-5e0ef51ea8da","Type":"ContainerStarted","Data":"4bbe36337c5739bf124f786030bfe1945dd61bff9ae2675b28f53937de185476"}
Apr 20 21:54:24.598315 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.598271 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nxq5f"]
Apr 20 21:54:24.601438 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.601421 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:24.603825 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.603802 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 21:54:24.605027 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.605008 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-2d59r\""
Apr 20 21:54:24.605144 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.605056 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 21:54:24.610859 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.610835 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nxq5f"]
Apr 20 21:54:24.724413 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.724390 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/422fbbca-96d4-4543-9f53-fa1ea9c8d9e9-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nxq5f\" (UID: \"422fbbca-96d4-4543-9f53-fa1ea9c8d9e9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:24.724545 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.724428 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz4bd\" (UniqueName: \"kubernetes.io/projected/422fbbca-96d4-4543-9f53-fa1ea9c8d9e9-kube-api-access-rz4bd\") pod \"cert-manager-webhook-587ccfb98-nxq5f\" (UID: \"422fbbca-96d4-4543-9f53-fa1ea9c8d9e9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:24.825646 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.825610 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/422fbbca-96d4-4543-9f53-fa1ea9c8d9e9-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nxq5f\" (UID: \"422fbbca-96d4-4543-9f53-fa1ea9c8d9e9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:24.825768 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.825651 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz4bd\" (UniqueName: \"kubernetes.io/projected/422fbbca-96d4-4543-9f53-fa1ea9c8d9e9-kube-api-access-rz4bd\") pod \"cert-manager-webhook-587ccfb98-nxq5f\" (UID: \"422fbbca-96d4-4543-9f53-fa1ea9c8d9e9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:24.833341 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.833315 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/422fbbca-96d4-4543-9f53-fa1ea9c8d9e9-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nxq5f\" (UID: \"422fbbca-96d4-4543-9f53-fa1ea9c8d9e9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:24.833648 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.833628 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz4bd\" (UniqueName: \"kubernetes.io/projected/422fbbca-96d4-4543-9f53-fa1ea9c8d9e9-kube-api-access-rz4bd\") pod \"cert-manager-webhook-587ccfb98-nxq5f\" (UID: \"422fbbca-96d4-4543-9f53-fa1ea9c8d9e9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:24.926127 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:24.926100 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:25.044433 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:25.044409 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nxq5f"]
Apr 20 21:54:25.046443 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:54:25.046414 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod422fbbca_96d4_4543_9f53_fa1ea9c8d9e9.slice/crio-00cb8345663943d8b89c6ebe9538bf08e8e5cd920b23e026f3baa38ffe26e012 WatchSource:0}: Error finding container 00cb8345663943d8b89c6ebe9538bf08e8e5cd920b23e026f3baa38ffe26e012: Status 404 returned error can't find the container with id 00cb8345663943d8b89c6ebe9538bf08e8e5cd920b23e026f3baa38ffe26e012
Apr 20 21:54:25.082768 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:25.082743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f" event={"ID":"422fbbca-96d4-4543-9f53-fa1ea9c8d9e9","Type":"ContainerStarted","Data":"00cb8345663943d8b89c6ebe9538bf08e8e5cd920b23e026f3baa38ffe26e012"}
Apr 20 21:54:29.097380 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:29.097347 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerID="05616b2632f9e002691e1a3aef156657eb81995c19cc9235db56d4c5533d0a09" exitCode=0
Apr 20 21:54:29.097823 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:29.097429 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw" event={"ID":"aa492c56-55e4-435a-8ba8-5e0ef51ea8da","Type":"ContainerDied","Data":"05616b2632f9e002691e1a3aef156657eb81995c19cc9235db56d4c5533d0a09"}
Apr 20 21:54:29.098846 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:29.098823 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f" event={"ID":"422fbbca-96d4-4543-9f53-fa1ea9c8d9e9","Type":"ContainerStarted","Data":"34983ea2b898e700512d97457a44effbf43596c2fe28ffe73d5d780696f43de1"}
Apr 20 21:54:29.098989 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:29.098937 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f"
Apr 20 21:54:29.127390 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:29.127342 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f" podStartSLOduration=1.8731624500000001 podStartE2EDuration="5.127327618s" podCreationTimestamp="2026-04-20 21:54:24 +0000 UTC" firstStartedPulling="2026-04-20 21:54:25.050632167 +0000 UTC m=+433.859319509" lastFinishedPulling="2026-04-20 21:54:28.304797334 +0000 UTC m=+437.113484677" observedRunningTime="2026-04-20 21:54:29.126066316 +0000 UTC m=+437.934753680" watchObservedRunningTime="2026-04-20 21:54:29.127327618 +0000 UTC m=+437.936014990"
Apr 20 21:54:30.103779 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:30.103745 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerID="5a789ab4f9c7fe0f9e806fd9ce0d11952168461bfb77f01e0046105f8599e5a3" exitCode=0
Apr 20 21:54:30.104180 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:30.103825 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw" event={"ID":"aa492c56-55e4-435a-8ba8-5e0ef51ea8da","Type":"ContainerDied","Data":"5a789ab4f9c7fe0f9e806fd9ce0d11952168461bfb77f01e0046105f8599e5a3"}
Apr 20 21:54:31.227770 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.227744 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw"
Apr 20 21:54:31.281415 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.281383 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-bundle\") pod \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") "
Apr 20 21:54:31.281571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.281485 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-util\") pod \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") "
Apr 20 21:54:31.281571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.281517 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h6h5\" (UniqueName: \"kubernetes.io/projected/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-kube-api-access-8h6h5\") pod \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\" (UID: \"aa492c56-55e4-435a-8ba8-5e0ef51ea8da\") "
Apr 20 21:54:31.281814 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.281786 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-bundle" (OuterVolumeSpecName: "bundle") pod "aa492c56-55e4-435a-8ba8-5e0ef51ea8da" (UID: "aa492c56-55e4-435a-8ba8-5e0ef51ea8da"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:54:31.283639 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.283617 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-kube-api-access-8h6h5" (OuterVolumeSpecName: "kube-api-access-8h6h5") pod "aa492c56-55e4-435a-8ba8-5e0ef51ea8da" (UID: "aa492c56-55e4-435a-8ba8-5e0ef51ea8da"). InnerVolumeSpecName "kube-api-access-8h6h5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:54:31.285455 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.285419 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-util" (OuterVolumeSpecName: "util") pod "aa492c56-55e4-435a-8ba8-5e0ef51ea8da" (UID: "aa492c56-55e4-435a-8ba8-5e0ef51ea8da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:54:31.382369 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.382312 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:54:31.382369 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.382337 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8h6h5\" (UniqueName: \"kubernetes.io/projected/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-kube-api-access-8h6h5\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:54:31.382369 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:31.382347 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa492c56-55e4-435a-8ba8-5e0ef51ea8da-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:54:32.112446 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:32.112359 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw" event={"ID":"aa492c56-55e4-435a-8ba8-5e0ef51ea8da","Type":"ContainerDied","Data":"4bbe36337c5739bf124f786030bfe1945dd61bff9ae2675b28f53937de185476"} Apr 20 21:54:32.112446 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:32.112388 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9szjw" Apr 20 21:54:32.112617 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:32.112392 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bbe36337c5739bf124f786030bfe1945dd61bff9ae2675b28f53937de185476" Apr 20 21:54:35.106093 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.106065 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-nxq5f" Apr 20 21:54:35.128463 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.128428 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-6jq8m"] Apr 20 21:54:35.128824 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.128806 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerName="pull" Apr 20 21:54:35.128824 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.128826 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerName="pull" Apr 20 21:54:35.128972 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.128849 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerName="extract" Apr 20 21:54:35.128972 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.128857 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" 
containerName="extract" Apr 20 21:54:35.128972 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.128875 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerName="util" Apr 20 21:54:35.128972 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.128884 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerName="util" Apr 20 21:54:35.128972 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.128962 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa492c56-55e4-435a-8ba8-5e0ef51ea8da" containerName="extract" Apr 20 21:54:35.133212 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.133184 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-6jq8m" Apr 20 21:54:35.135669 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.135649 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-p26vt\"" Apr 20 21:54:35.141308 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.141269 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-6jq8m"] Apr 20 21:54:35.212250 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.212214 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws9gv\" (UniqueName: \"kubernetes.io/projected/53d022bc-36c3-4790-afa6-045f4a7f787b-kube-api-access-ws9gv\") pod \"cert-manager-79c8d999ff-6jq8m\" (UID: \"53d022bc-36c3-4790-afa6-045f4a7f787b\") " pod="cert-manager/cert-manager-79c8d999ff-6jq8m" Apr 20 21:54:35.212435 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.212367 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/53d022bc-36c3-4790-afa6-045f4a7f787b-bound-sa-token\") pod \"cert-manager-79c8d999ff-6jq8m\" (UID: \"53d022bc-36c3-4790-afa6-045f4a7f787b\") " pod="cert-manager/cert-manager-79c8d999ff-6jq8m" Apr 20 21:54:35.312769 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.312721 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53d022bc-36c3-4790-afa6-045f4a7f787b-bound-sa-token\") pod \"cert-manager-79c8d999ff-6jq8m\" (UID: \"53d022bc-36c3-4790-afa6-045f4a7f787b\") " pod="cert-manager/cert-manager-79c8d999ff-6jq8m" Apr 20 21:54:35.312937 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.312791 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws9gv\" (UniqueName: \"kubernetes.io/projected/53d022bc-36c3-4790-afa6-045f4a7f787b-kube-api-access-ws9gv\") pod \"cert-manager-79c8d999ff-6jq8m\" (UID: \"53d022bc-36c3-4790-afa6-045f4a7f787b\") " pod="cert-manager/cert-manager-79c8d999ff-6jq8m" Apr 20 21:54:35.320696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.320664 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53d022bc-36c3-4790-afa6-045f4a7f787b-bound-sa-token\") pod \"cert-manager-79c8d999ff-6jq8m\" (UID: \"53d022bc-36c3-4790-afa6-045f4a7f787b\") " pod="cert-manager/cert-manager-79c8d999ff-6jq8m" Apr 20 21:54:35.320879 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.320856 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws9gv\" (UniqueName: \"kubernetes.io/projected/53d022bc-36c3-4790-afa6-045f4a7f787b-kube-api-access-ws9gv\") pod \"cert-manager-79c8d999ff-6jq8m\" (UID: \"53d022bc-36c3-4790-afa6-045f4a7f787b\") " pod="cert-manager/cert-manager-79c8d999ff-6jq8m" Apr 20 21:54:35.443653 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.443569 2566 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-6jq8m" Apr 20 21:54:35.559422 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:35.559397 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-6jq8m"] Apr 20 21:54:35.561982 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:54:35.561943 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d022bc_36c3_4790_afa6_045f4a7f787b.slice/crio-85bb71b85e651c3dc911f512f3d5397bc0fae171e9d49d3be327575d0ac3015f WatchSource:0}: Error finding container 85bb71b85e651c3dc911f512f3d5397bc0fae171e9d49d3be327575d0ac3015f: Status 404 returned error can't find the container with id 85bb71b85e651c3dc911f512f3d5397bc0fae171e9d49d3be327575d0ac3015f Apr 20 21:54:36.127310 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:36.127265 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-6jq8m" event={"ID":"53d022bc-36c3-4790-afa6-045f4a7f787b","Type":"ContainerStarted","Data":"5ec794ce8b39d7927d806860f3a87ec4684905cef627b381b9d0219bac0e1acd"} Apr 20 21:54:36.127310 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:36.127310 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-6jq8m" event={"ID":"53d022bc-36c3-4790-afa6-045f4a7f787b","Type":"ContainerStarted","Data":"85bb71b85e651c3dc911f512f3d5397bc0fae171e9d49d3be327575d0ac3015f"} Apr 20 21:54:36.143486 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:36.143440 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-6jq8m" podStartSLOduration=1.14342769 podStartE2EDuration="1.14342769s" podCreationTimestamp="2026-04-20 21:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:54:36.141258031 +0000 
UTC m=+444.949945397" watchObservedRunningTime="2026-04-20 21:54:36.14342769 +0000 UTC m=+444.952115055" Apr 20 21:54:37.548216 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.548183 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr"] Apr 20 21:54:37.551670 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.551655 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" Apr 20 21:54:37.554134 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.554113 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 21:54:37.555123 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.555108 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 21:54:37.555177 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.555127 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-5twsm\"" Apr 20 21:54:37.558472 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.558451 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr"] Apr 20 21:54:37.632178 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.632155 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bb83dec-e814-4768-b568-b10bb95213b7-tmp\") pod \"openshift-lws-operator-bfc7f696d-jkhkr\" (UID: \"5bb83dec-e814-4768-b568-b10bb95213b7\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" Apr 20 21:54:37.632324 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.632211 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2wd\" (UniqueName: \"kubernetes.io/projected/5bb83dec-e814-4768-b568-b10bb95213b7-kube-api-access-7b2wd\") pod \"openshift-lws-operator-bfc7f696d-jkhkr\" (UID: \"5bb83dec-e814-4768-b568-b10bb95213b7\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" Apr 20 21:54:37.733440 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.733403 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b2wd\" (UniqueName: \"kubernetes.io/projected/5bb83dec-e814-4768-b568-b10bb95213b7-kube-api-access-7b2wd\") pod \"openshift-lws-operator-bfc7f696d-jkhkr\" (UID: \"5bb83dec-e814-4768-b568-b10bb95213b7\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" Apr 20 21:54:37.733563 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.733505 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bb83dec-e814-4768-b568-b10bb95213b7-tmp\") pod \"openshift-lws-operator-bfc7f696d-jkhkr\" (UID: \"5bb83dec-e814-4768-b568-b10bb95213b7\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" Apr 20 21:54:37.733827 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.733811 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bb83dec-e814-4768-b568-b10bb95213b7-tmp\") pod \"openshift-lws-operator-bfc7f696d-jkhkr\" (UID: \"5bb83dec-e814-4768-b568-b10bb95213b7\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" Apr 20 21:54:37.740899 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.740880 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b2wd\" (UniqueName: \"kubernetes.io/projected/5bb83dec-e814-4768-b568-b10bb95213b7-kube-api-access-7b2wd\") pod \"openshift-lws-operator-bfc7f696d-jkhkr\" (UID: 
\"5bb83dec-e814-4768-b568-b10bb95213b7\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" Apr 20 21:54:37.861222 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.861157 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" Apr 20 21:54:37.973543 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:37.973519 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr"] Apr 20 21:54:37.975135 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:54:37.975111 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bb83dec_e814_4768_b568_b10bb95213b7.slice/crio-193b0aaf7cd8ba73e1766f485cd5547f64914b4b832d042a7c40281248613548 WatchSource:0}: Error finding container 193b0aaf7cd8ba73e1766f485cd5547f64914b4b832d042a7c40281248613548: Status 404 returned error can't find the container with id 193b0aaf7cd8ba73e1766f485cd5547f64914b4b832d042a7c40281248613548 Apr 20 21:54:38.135485 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:38.135411 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" event={"ID":"5bb83dec-e814-4768-b568-b10bb95213b7","Type":"ContainerStarted","Data":"193b0aaf7cd8ba73e1766f485cd5547f64914b4b832d042a7c40281248613548"} Apr 20 21:54:41.147506 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:41.147425 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" event={"ID":"5bb83dec-e814-4768-b568-b10bb95213b7","Type":"ContainerStarted","Data":"22a4bc14bfaa86ddb02fdf9c52b640e8171822d93c506885ed47d26da3d0e77d"} Apr 20 21:54:41.161977 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:41.161922 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jkhkr" podStartSLOduration=1.4056475480000001 podStartE2EDuration="4.161905565s" podCreationTimestamp="2026-04-20 21:54:37 +0000 UTC" firstStartedPulling="2026-04-20 21:54:37.976681455 +0000 UTC m=+446.785368797" lastFinishedPulling="2026-04-20 21:54:40.73293947 +0000 UTC m=+449.541626814" observedRunningTime="2026-04-20 21:54:41.161603859 +0000 UTC m=+449.970291225" watchObservedRunningTime="2026-04-20 21:54:41.161905565 +0000 UTC m=+449.970592932" Apr 20 21:54:43.466335 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.466302 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9"] Apr 20 21:54:43.498101 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.498071 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9"] Apr 20 21:54:43.498232 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.498113 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.501000 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.500982 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 21:54:43.501139 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.501027 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 21:54:43.502254 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.502235 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5b4bm\"" Apr 20 21:54:43.586457 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.586433 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.586587 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.586468 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.586587 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.586517 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpsnb\" (UniqueName: 
\"kubernetes.io/projected/82c155c8-2a71-427a-8c96-e5a630da7cd5-kube-api-access-mpsnb\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.687527 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.687495 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpsnb\" (UniqueName: \"kubernetes.io/projected/82c155c8-2a71-427a-8c96-e5a630da7cd5-kube-api-access-mpsnb\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.687692 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.687554 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.687692 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.687580 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.687906 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.687884 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.687943 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.687905 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.697897 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.697863 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpsnb\" (UniqueName: \"kubernetes.io/projected/82c155c8-2a71-427a-8c96-e5a630da7cd5-kube-api-access-mpsnb\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.807746 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.807673 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:43.927496 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:43.927463 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9"] Apr 20 21:54:43.930639 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:54:43.930608 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c155c8_2a71_427a_8c96_e5a630da7cd5.slice/crio-ff915f7e6194e0b3dc3b6b6b3ce7b38b15271c93046b0294609a2af3880db246 WatchSource:0}: Error finding container ff915f7e6194e0b3dc3b6b6b3ce7b38b15271c93046b0294609a2af3880db246: Status 404 returned error can't find the container with id ff915f7e6194e0b3dc3b6b6b3ce7b38b15271c93046b0294609a2af3880db246 Apr 20 21:54:44.158871 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:44.158837 2566 generic.go:358] "Generic (PLEG): container finished" podID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerID="045acdefc02aef96179c5b2039b3edd0fb97c48bb70febaee885a23d389546b6" exitCode=0 Apr 20 21:54:44.158987 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:44.158885 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" event={"ID":"82c155c8-2a71-427a-8c96-e5a630da7cd5","Type":"ContainerDied","Data":"045acdefc02aef96179c5b2039b3edd0fb97c48bb70febaee885a23d389546b6"} Apr 20 21:54:44.158987 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:44.158906 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" event={"ID":"82c155c8-2a71-427a-8c96-e5a630da7cd5","Type":"ContainerStarted","Data":"ff915f7e6194e0b3dc3b6b6b3ce7b38b15271c93046b0294609a2af3880db246"} Apr 20 21:54:46.167176 ip-10-0-137-199 kubenswrapper[2566]: 
I0420 21:54:46.167139 2566 generic.go:358] "Generic (PLEG): container finished" podID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerID="37ed6d4e1540670237502df1497b1ff61ede60ce58c4c6227978f75c7de6c43f" exitCode=0 Apr 20 21:54:46.167558 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:46.167210 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" event={"ID":"82c155c8-2a71-427a-8c96-e5a630da7cd5","Type":"ContainerDied","Data":"37ed6d4e1540670237502df1497b1ff61ede60ce58c4c6227978f75c7de6c43f"} Apr 20 21:54:47.172071 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:47.172036 2566 generic.go:358] "Generic (PLEG): container finished" podID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerID="28d59b00ed8d053c3d2d148f2f710687d0013034ff05197a9141d30885157a21" exitCode=0 Apr 20 21:54:47.172516 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:47.172079 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" event={"ID":"82c155c8-2a71-427a-8c96-e5a630da7cd5","Type":"ContainerDied","Data":"28d59b00ed8d053c3d2d148f2f710687d0013034ff05197a9141d30885157a21"} Apr 20 21:54:48.293888 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.293867 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:48.427914 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.427833 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-bundle\") pod \"82c155c8-2a71-427a-8c96-e5a630da7cd5\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " Apr 20 21:54:48.427914 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.427895 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-util\") pod \"82c155c8-2a71-427a-8c96-e5a630da7cd5\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " Apr 20 21:54:48.428122 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.427940 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpsnb\" (UniqueName: \"kubernetes.io/projected/82c155c8-2a71-427a-8c96-e5a630da7cd5-kube-api-access-mpsnb\") pod \"82c155c8-2a71-427a-8c96-e5a630da7cd5\" (UID: \"82c155c8-2a71-427a-8c96-e5a630da7cd5\") " Apr 20 21:54:48.428547 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.428519 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-bundle" (OuterVolumeSpecName: "bundle") pod "82c155c8-2a71-427a-8c96-e5a630da7cd5" (UID: "82c155c8-2a71-427a-8c96-e5a630da7cd5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:54:48.429961 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.429935 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c155c8-2a71-427a-8c96-e5a630da7cd5-kube-api-access-mpsnb" (OuterVolumeSpecName: "kube-api-access-mpsnb") pod "82c155c8-2a71-427a-8c96-e5a630da7cd5" (UID: "82c155c8-2a71-427a-8c96-e5a630da7cd5"). InnerVolumeSpecName "kube-api-access-mpsnb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:54:48.433594 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.433573 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-util" (OuterVolumeSpecName: "util") pod "82c155c8-2a71-427a-8c96-e5a630da7cd5" (UID: "82c155c8-2a71-427a-8c96-e5a630da7cd5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:54:48.529382 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.529352 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:54:48.529382 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.529377 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mpsnb\" (UniqueName: \"kubernetes.io/projected/82c155c8-2a71-427a-8c96-e5a630da7cd5-kube-api-access-mpsnb\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:54:48.529382 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:48.529389 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82c155c8-2a71-427a-8c96-e5a630da7cd5-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:54:49.180049 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:49.180019 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" event={"ID":"82c155c8-2a71-427a-8c96-e5a630da7cd5","Type":"ContainerDied","Data":"ff915f7e6194e0b3dc3b6b6b3ce7b38b15271c93046b0294609a2af3880db246"} Apr 20 21:54:49.180049 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:49.180051 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff915f7e6194e0b3dc3b6b6b3ce7b38b15271c93046b0294609a2af3880db246" Apr 20 21:54:49.180251 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:49.180068 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fgvh9" Apr 20 21:54:58.866166 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.866127 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq"] Apr 20 21:54:58.866544 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.866496 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerName="pull" Apr 20 21:54:58.866544 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.866510 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerName="pull" Apr 20 21:54:58.866544 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.866517 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerName="extract" Apr 20 21:54:58.866544 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.866524 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerName="extract" Apr 20 21:54:58.866544 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.866537 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerName="util" Apr 20 21:54:58.866544 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.866543 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerName="util" Apr 20 21:54:58.866719 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.866620 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="82c155c8-2a71-427a-8c96-e5a630da7cd5" containerName="extract" Apr 20 21:54:58.870983 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.870966 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:58.874119 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.874089 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5b4bm\"" Apr 20 21:54:58.874255 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.874169 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 21:54:58.880166 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.880140 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 21:54:58.882618 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:58.882588 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq"] Apr 20 21:54:59.011674 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.011645 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kzdh\" (UniqueName: \"kubernetes.io/projected/a0513538-6379-48d2-baae-ce01ae943c00-kube-api-access-4kzdh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" 
(UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.011674 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.011686 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.011863 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.011750 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.112447 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.112414 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kzdh\" (UniqueName: \"kubernetes.io/projected/a0513538-6379-48d2-baae-ce01ae943c00-kube-api-access-4kzdh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.112632 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.112459 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" (UID: 
\"a0513538-6379-48d2-baae-ce01ae943c00\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.112632 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.112492 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.112835 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.112818 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.112885 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.112852 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.121595 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.121548 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kzdh\" (UniqueName: \"kubernetes.io/projected/a0513538-6379-48d2-baae-ce01ae943c00-kube-api-access-4kzdh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.187408 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.187369 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:54:59.305561 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:54:59.305536 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq"] Apr 20 21:54:59.308055 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:54:59.308022 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0513538_6379_48d2_baae_ce01ae943c00.slice/crio-0d691721d27a2fffcf086145a7d6358118f617ed9d993aadea3814f7cc19049a WatchSource:0}: Error finding container 0d691721d27a2fffcf086145a7d6358118f617ed9d993aadea3814f7cc19049a: Status 404 returned error can't find the container with id 0d691721d27a2fffcf086145a7d6358118f617ed9d993aadea3814f7cc19049a Apr 20 21:55:00.085806 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.085772 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw"] Apr 20 21:55:00.089380 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.089357 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.092785 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.092763 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 21:55:00.093006 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.092990 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 21:55:00.093455 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.093441 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-lmzqf\"" Apr 20 21:55:00.093634 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.093616 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 21:55:00.093802 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.093786 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 21:55:00.114676 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.114644 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw"] Apr 20 21:55:00.216864 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.216830 2566 generic.go:358] "Generic (PLEG): container finished" podID="a0513538-6379-48d2-baae-ce01ae943c00" containerID="161ed18cd929e0108ccf9c3c9eeb66e3298f038119bb88b74f5266445e4d6368" exitCode=0 Apr 20 21:55:00.216986 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.216915 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" 
event={"ID":"a0513538-6379-48d2-baae-ce01ae943c00","Type":"ContainerDied","Data":"161ed18cd929e0108ccf9c3c9eeb66e3298f038119bb88b74f5266445e4d6368"} Apr 20 21:55:00.216986 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.216951 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" event={"ID":"a0513538-6379-48d2-baae-ce01ae943c00","Type":"ContainerStarted","Data":"0d691721d27a2fffcf086145a7d6358118f617ed9d993aadea3814f7cc19049a"} Apr 20 21:55:00.220065 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.220047 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk424\" (UniqueName: \"kubernetes.io/projected/9ea8269c-9fdb-442a-ae81-5d278f62e768-kube-api-access-mk424\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.220131 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.220108 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ea8269c-9fdb-442a-ae81-5d278f62e768-apiservice-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.220131 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.220127 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ea8269c-9fdb-442a-ae81-5d278f62e768-webhook-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 
21:55:00.321382 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.321350 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk424\" (UniqueName: \"kubernetes.io/projected/9ea8269c-9fdb-442a-ae81-5d278f62e768-kube-api-access-mk424\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.321521 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.321415 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ea8269c-9fdb-442a-ae81-5d278f62e768-apiservice-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.321521 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.321436 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ea8269c-9fdb-442a-ae81-5d278f62e768-webhook-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.323868 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.323842 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ea8269c-9fdb-442a-ae81-5d278f62e768-webhook-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.323978 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.323907 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ea8269c-9fdb-442a-ae81-5d278f62e768-apiservice-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.335602 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.335582 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk424\" (UniqueName: \"kubernetes.io/projected/9ea8269c-9fdb-442a-ae81-5d278f62e768-kube-api-access-mk424\") pod \"opendatahub-operator-controller-manager-f5f47469b-mdkqw\" (UID: \"9ea8269c-9fdb-442a-ae81-5d278f62e768\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.399538 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.399478 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:00.531828 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:00.531804 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw"] Apr 20 21:55:00.534203 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:55:00.534172 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea8269c_9fdb_442a_ae81_5d278f62e768.slice/crio-a7a01616b8ad61d817092b7a62d000e46adc98cbeb02fa67f939cd995b0eef6c WatchSource:0}: Error finding container a7a01616b8ad61d817092b7a62d000e46adc98cbeb02fa67f939cd995b0eef6c: Status 404 returned error can't find the container with id a7a01616b8ad61d817092b7a62d000e46adc98cbeb02fa67f939cd995b0eef6c Apr 20 21:55:01.223643 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:01.223582 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" 
event={"ID":"9ea8269c-9fdb-442a-ae81-5d278f62e768","Type":"ContainerStarted","Data":"a7a01616b8ad61d817092b7a62d000e46adc98cbeb02fa67f939cd995b0eef6c"} Apr 20 21:55:01.225643 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:01.225610 2566 generic.go:358] "Generic (PLEG): container finished" podID="a0513538-6379-48d2-baae-ce01ae943c00" containerID="3e604f8eb5b7829e0e558d45c3258668ec1031765acb94b96072c1ae8adaf402" exitCode=0 Apr 20 21:55:01.225782 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:01.225657 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" event={"ID":"a0513538-6379-48d2-baae-ce01ae943c00","Type":"ContainerDied","Data":"3e604f8eb5b7829e0e558d45c3258668ec1031765acb94b96072c1ae8adaf402"} Apr 20 21:55:02.232047 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:02.232014 2566 generic.go:358] "Generic (PLEG): container finished" podID="a0513538-6379-48d2-baae-ce01ae943c00" containerID="485e7bd6e3edd2abfba52838d78acdf4bb63da1ac071eb4635f85fbe2a07b560" exitCode=0 Apr 20 21:55:02.232436 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:02.232066 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" event={"ID":"a0513538-6379-48d2-baae-ce01ae943c00","Type":"ContainerDied","Data":"485e7bd6e3edd2abfba52838d78acdf4bb63da1ac071eb4635f85fbe2a07b560"} Apr 20 21:55:03.236760 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.236722 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" event={"ID":"9ea8269c-9fdb-442a-ae81-5d278f62e768","Type":"ContainerStarted","Data":"c92872386e4071adb4b631866b456fccf10869e4d0eddb7e1727fcf9fd2aa923"} Apr 20 21:55:03.256548 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.256506 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" podStartSLOduration=0.749863297 podStartE2EDuration="3.256490954s" podCreationTimestamp="2026-04-20 21:55:00 +0000 UTC" firstStartedPulling="2026-04-20 21:55:00.539467992 +0000 UTC m=+469.348155335" lastFinishedPulling="2026-04-20 21:55:03.046095644 +0000 UTC m=+471.854782992" observedRunningTime="2026-04-20 21:55:03.25505136 +0000 UTC m=+472.063738725" watchObservedRunningTime="2026-04-20 21:55:03.256490954 +0000 UTC m=+472.065178320" Apr 20 21:55:03.360697 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.360675 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:55:03.450210 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.450177 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-bundle\") pod \"a0513538-6379-48d2-baae-ce01ae943c00\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " Apr 20 21:55:03.450394 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.450228 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-util\") pod \"a0513538-6379-48d2-baae-ce01ae943c00\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " Apr 20 21:55:03.450394 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.450275 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kzdh\" (UniqueName: \"kubernetes.io/projected/a0513538-6379-48d2-baae-ce01ae943c00-kube-api-access-4kzdh\") pod \"a0513538-6379-48d2-baae-ce01ae943c00\" (UID: \"a0513538-6379-48d2-baae-ce01ae943c00\") " Apr 20 21:55:03.450997 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.450965 2566 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-bundle" (OuterVolumeSpecName: "bundle") pod "a0513538-6379-48d2-baae-ce01ae943c00" (UID: "a0513538-6379-48d2-baae-ce01ae943c00"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:55:03.452458 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.452434 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0513538-6379-48d2-baae-ce01ae943c00-kube-api-access-4kzdh" (OuterVolumeSpecName: "kube-api-access-4kzdh") pod "a0513538-6379-48d2-baae-ce01ae943c00" (UID: "a0513538-6379-48d2-baae-ce01ae943c00"). InnerVolumeSpecName "kube-api-access-4kzdh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:55:03.455945 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.455921 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-util" (OuterVolumeSpecName: "util") pod "a0513538-6379-48d2-baae-ce01ae943c00" (UID: "a0513538-6379-48d2-baae-ce01ae943c00"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:55:03.551512 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.551471 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:55:03.551512 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.551506 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0513538-6379-48d2-baae-ce01ae943c00-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:55:03.551512 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:03.551517 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kzdh\" (UniqueName: \"kubernetes.io/projected/a0513538-6379-48d2-baae-ce01ae943c00-kube-api-access-4kzdh\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:55:04.241292 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:04.241245 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" event={"ID":"a0513538-6379-48d2-baae-ce01ae943c00","Type":"ContainerDied","Data":"0d691721d27a2fffcf086145a7d6358118f617ed9d993aadea3814f7cc19049a"} Apr 20 21:55:04.241654 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:04.241303 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d691721d27a2fffcf086145a7d6358118f617ed9d993aadea3814f7cc19049a" Apr 20 21:55:04.241654 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:04.241348 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9k24sq" Apr 20 21:55:04.241654 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:04.241417 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:15.248167 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:15.248138 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-mdkqw" Apr 20 21:55:20.879997 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.879966 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-mmcb7"] Apr 20 21:55:20.880355 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.880293 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0513538-6379-48d2-baae-ce01ae943c00" containerName="pull" Apr 20 21:55:20.880355 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.880306 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0513538-6379-48d2-baae-ce01ae943c00" containerName="pull" Apr 20 21:55:20.880355 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.880327 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0513538-6379-48d2-baae-ce01ae943c00" containerName="extract" Apr 20 21:55:20.880355 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.880333 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0513538-6379-48d2-baae-ce01ae943c00" containerName="extract" Apr 20 21:55:20.880355 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.880340 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0513538-6379-48d2-baae-ce01ae943c00" containerName="util" Apr 20 21:55:20.880355 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.880345 2566 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a0513538-6379-48d2-baae-ce01ae943c00" containerName="util" Apr 20 21:55:20.880541 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.880397 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0513538-6379-48d2-baae-ce01ae943c00" containerName="extract" Apr 20 21:55:20.884921 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.884903 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" Apr 20 21:55:20.887128 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.887110 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 21:55:20.887403 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.887387 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-tgnfs\"" Apr 20 21:55:20.893340 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.893318 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-mmcb7"] Apr 20 21:55:20.986915 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.986885 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert\") pod \"odh-model-controller-858dbf95b8-mmcb7\" (UID: \"a2ddb053-015c-4b09-8687-7878eac0bcc5\") " pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" Apr 20 21:55:20.987043 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:20.986928 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5lq\" (UniqueName: \"kubernetes.io/projected/a2ddb053-015c-4b09-8687-7878eac0bcc5-kube-api-access-4p5lq\") pod \"odh-model-controller-858dbf95b8-mmcb7\" (UID: \"a2ddb053-015c-4b09-8687-7878eac0bcc5\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" Apr 20 21:55:21.087626 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:21.087595 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5lq\" (UniqueName: \"kubernetes.io/projected/a2ddb053-015c-4b09-8687-7878eac0bcc5-kube-api-access-4p5lq\") pod \"odh-model-controller-858dbf95b8-mmcb7\" (UID: \"a2ddb053-015c-4b09-8687-7878eac0bcc5\") " pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" Apr 20 21:55:21.087774 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:21.087669 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert\") pod \"odh-model-controller-858dbf95b8-mmcb7\" (UID: \"a2ddb053-015c-4b09-8687-7878eac0bcc5\") " pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" Apr 20 21:55:21.087774 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:55:21.087761 2566 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 21:55:21.087840 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:55:21.087811 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert podName:a2ddb053-015c-4b09-8687-7878eac0bcc5 nodeName:}" failed. No retries permitted until 2026-04-20 21:55:21.587796552 +0000 UTC m=+490.396483896 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert") pod "odh-model-controller-858dbf95b8-mmcb7" (UID: "a2ddb053-015c-4b09-8687-7878eac0bcc5") : secret "odh-model-controller-webhook-cert" not found
Apr 20 21:55:21.096030 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:21.096009 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5lq\" (UniqueName: \"kubernetes.io/projected/a2ddb053-015c-4b09-8687-7878eac0bcc5-kube-api-access-4p5lq\") pod \"odh-model-controller-858dbf95b8-mmcb7\" (UID: \"a2ddb053-015c-4b09-8687-7878eac0bcc5\") " pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7"
Apr 20 21:55:21.590816 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:21.590784 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert\") pod \"odh-model-controller-858dbf95b8-mmcb7\" (UID: \"a2ddb053-015c-4b09-8687-7878eac0bcc5\") " pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7"
Apr 20 21:55:21.591012 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:55:21.590958 2566 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 21:55:21.591082 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:55:21.591035 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert podName:a2ddb053-015c-4b09-8687-7878eac0bcc5 nodeName:}" failed. No retries permitted until 2026-04-20 21:55:22.591014339 +0000 UTC m=+491.399701699 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert") pod "odh-model-controller-858dbf95b8-mmcb7" (UID: "a2ddb053-015c-4b09-8687-7878eac0bcc5") : secret "odh-model-controller-webhook-cert" not found
Apr 20 21:55:22.598965 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:22.598929 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert\") pod \"odh-model-controller-858dbf95b8-mmcb7\" (UID: \"a2ddb053-015c-4b09-8687-7878eac0bcc5\") " pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7"
Apr 20 21:55:22.601365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:22.601341 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2ddb053-015c-4b09-8687-7878eac0bcc5-cert\") pod \"odh-model-controller-858dbf95b8-mmcb7\" (UID: \"a2ddb053-015c-4b09-8687-7878eac0bcc5\") " pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7"
Apr 20 21:55:22.696054 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:22.696014 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7"
Apr 20 21:55:22.813219 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:22.813194 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-mmcb7"]
Apr 20 21:55:22.814674 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:55:22.814647 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2ddb053_015c_4b09_8687_7878eac0bcc5.slice/crio-266de1d6b259aa34bc150023c5dca556fb28e7a73da44e947394ab080be11c98 WatchSource:0}: Error finding container 266de1d6b259aa34bc150023c5dca556fb28e7a73da44e947394ab080be11c98: Status 404 returned error can't find the container with id 266de1d6b259aa34bc150023c5dca556fb28e7a73da44e947394ab080be11c98
Apr 20 21:55:23.305489 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:23.305450 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" event={"ID":"a2ddb053-015c-4b09-8687-7878eac0bcc5","Type":"ContainerStarted","Data":"266de1d6b259aa34bc150023c5dca556fb28e7a73da44e947394ab080be11c98"}
Apr 20 21:55:26.007471 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.007438 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7xpgh"]
Apr 20 21:55:26.010665 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.010649 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:26.014680 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.014663 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 20 21:55:26.014786 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.014770 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-dsc9f\""
Apr 20 21:55:26.024659 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.024635 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7xpgh"]
Apr 20 21:55:26.130648 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.130603 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmptq\" (UniqueName: \"kubernetes.io/projected/1aac7f00-8545-40c9-906a-0719d15b0d78-kube-api-access-qmptq\") pod \"kserve-controller-manager-856948b99f-7xpgh\" (UID: \"1aac7f00-8545-40c9-906a-0719d15b0d78\") " pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:26.130804 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.130682 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1aac7f00-8545-40c9-906a-0719d15b0d78-cert\") pod \"kserve-controller-manager-856948b99f-7xpgh\" (UID: \"1aac7f00-8545-40c9-906a-0719d15b0d78\") " pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:26.231297 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.231247 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1aac7f00-8545-40c9-906a-0719d15b0d78-cert\") pod \"kserve-controller-manager-856948b99f-7xpgh\" (UID: \"1aac7f00-8545-40c9-906a-0719d15b0d78\") " pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:26.231444 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:55:26.231386 2566 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 20 21:55:26.231444 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:55:26.231443 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1aac7f00-8545-40c9-906a-0719d15b0d78-cert podName:1aac7f00-8545-40c9-906a-0719d15b0d78 nodeName:}" failed. No retries permitted until 2026-04-20 21:55:26.731426911 +0000 UTC m=+495.540114258 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1aac7f00-8545-40c9-906a-0719d15b0d78-cert") pod "kserve-controller-manager-856948b99f-7xpgh" (UID: "1aac7f00-8545-40c9-906a-0719d15b0d78") : secret "kserve-webhook-server-cert" not found
Apr 20 21:55:26.231548 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.231384 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmptq\" (UniqueName: \"kubernetes.io/projected/1aac7f00-8545-40c9-906a-0719d15b0d78-kube-api-access-qmptq\") pod \"kserve-controller-manager-856948b99f-7xpgh\" (UID: \"1aac7f00-8545-40c9-906a-0719d15b0d78\") " pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:26.241193 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.241169 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmptq\" (UniqueName: \"kubernetes.io/projected/1aac7f00-8545-40c9-906a-0719d15b0d78-kube-api-access-qmptq\") pod \"kserve-controller-manager-856948b99f-7xpgh\" (UID: \"1aac7f00-8545-40c9-906a-0719d15b0d78\") " pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:26.318616 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.318546 2566 generic.go:358] "Generic (PLEG): container finished" podID="a2ddb053-015c-4b09-8687-7878eac0bcc5" containerID="c4c02d1a0c9e898993f7d01c0fcef1a7a03a03abb891661087dceb08f025f828" exitCode=1
Apr 20 21:55:26.318616 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.318585 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" event={"ID":"a2ddb053-015c-4b09-8687-7878eac0bcc5","Type":"ContainerDied","Data":"c4c02d1a0c9e898993f7d01c0fcef1a7a03a03abb891661087dceb08f025f828"}
Apr 20 21:55:26.318838 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.318824 2566 scope.go:117] "RemoveContainer" containerID="c4c02d1a0c9e898993f7d01c0fcef1a7a03a03abb891661087dceb08f025f828"
Apr 20 21:55:26.735978 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.735945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1aac7f00-8545-40c9-906a-0719d15b0d78-cert\") pod \"kserve-controller-manager-856948b99f-7xpgh\" (UID: \"1aac7f00-8545-40c9-906a-0719d15b0d78\") " pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:26.738251 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.738229 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1aac7f00-8545-40c9-906a-0719d15b0d78-cert\") pod \"kserve-controller-manager-856948b99f-7xpgh\" (UID: \"1aac7f00-8545-40c9-906a-0719d15b0d78\") " pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:26.921663 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:26.921593 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:27.060738 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:27.060715 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7xpgh"]
Apr 20 21:55:27.324630 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:27.324592 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" event={"ID":"a2ddb053-015c-4b09-8687-7878eac0bcc5","Type":"ContainerStarted","Data":"f8c49a3e571cb2cb91c76168846e9596b096e72a1ed7325a4b3ff81bba83913e"}
Apr 20 21:55:27.324818 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:27.324741 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7"
Apr 20 21:55:27.325924 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:27.325900 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh" event={"ID":"1aac7f00-8545-40c9-906a-0719d15b0d78","Type":"ContainerStarted","Data":"a5e27c7c02c0f8484e49211963aff5607947259a327f970a38fe68a0ff402a77"}
Apr 20 21:55:27.342866 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:27.342820 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7" podStartSLOduration=3.597013796 podStartE2EDuration="7.342807837s" podCreationTimestamp="2026-04-20 21:55:20 +0000 UTC" firstStartedPulling="2026-04-20 21:55:22.815858142 +0000 UTC m=+491.624545486" lastFinishedPulling="2026-04-20 21:55:26.561652183 +0000 UTC m=+495.370339527" observedRunningTime="2026-04-20 21:55:27.341747403 +0000 UTC m=+496.150434768" watchObservedRunningTime="2026-04-20 21:55:27.342807837 +0000 UTC m=+496.151495201"
Apr 20 21:55:28.685609 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.685574 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"]
Apr 20 21:55:28.689188 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.689170 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.692507 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.692484 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5b4bm\""
Apr 20 21:55:28.692620 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.692596 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 21:55:28.693659 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.693643 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 21:55:28.703060 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.703041 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"]
Apr 20 21:55:28.754096 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.754067 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqkrh\" (UniqueName: \"kubernetes.io/projected/48d5a704-b77b-46fe-8c18-1246e15c6ce2-kube-api-access-hqkrh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.754267 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.754118 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.754267 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.754233 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.855715 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.855676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.855881 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.855763 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqkrh\" (UniqueName: \"kubernetes.io/projected/48d5a704-b77b-46fe-8c18-1246e15c6ce2-kube-api-access-hqkrh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.855881 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.855799 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.856346 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.856270 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.856492 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.856349 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.858246 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.858226 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"]
Apr 20 21:55:28.861588 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.861573 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:28.863987 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.863967 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 21:55:28.864069 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.864017 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 21:55:28.864135 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.864018 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-pdkmt\""
Apr 20 21:55:28.871054 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.871032 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"]
Apr 20 21:55:28.875272 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.875245 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqkrh\" (UniqueName: \"kubernetes.io/projected/48d5a704-b77b-46fe-8c18-1246e15c6ce2-kube-api-access-hqkrh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:28.956815 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.956786 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51b22ece-4292-4ffa-baa4-ff3757373b19-tmp\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:28.956949 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.956819 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51b22ece-4292-4ffa-baa4-ff3757373b19-tls-certs\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:28.956949 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.956850 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9jx6\" (UniqueName: \"kubernetes.io/projected/51b22ece-4292-4ffa-baa4-ff3757373b19-kube-api-access-t9jx6\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:28.998696 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:28.998670 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:29.058743 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.058454 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51b22ece-4292-4ffa-baa4-ff3757373b19-tmp\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:29.058743 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.058508 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51b22ece-4292-4ffa-baa4-ff3757373b19-tls-certs\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:29.058743 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.058561 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jx6\" (UniqueName: \"kubernetes.io/projected/51b22ece-4292-4ffa-baa4-ff3757373b19-kube-api-access-t9jx6\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:29.061893 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.061839 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51b22ece-4292-4ffa-baa4-ff3757373b19-tls-certs\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:29.062006 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.061889 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51b22ece-4292-4ffa-baa4-ff3757373b19-tmp\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:29.067789 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.067742 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9jx6\" (UniqueName: \"kubernetes.io/projected/51b22ece-4292-4ffa-baa4-ff3757373b19-kube-api-access-t9jx6\") pod \"kube-auth-proxy-d7f98b469-5hqqz\" (UID: \"51b22ece-4292-4ffa-baa4-ff3757373b19\") " pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:29.142266 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.142243 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"]
Apr 20 21:55:29.144319 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:55:29.144257 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48d5a704_b77b_46fe_8c18_1246e15c6ce2.slice/crio-033d2e727113ffa176a348fd486aae7a72137f27456c74b199353073cda7bd13 WatchSource:0}: Error finding container 033d2e727113ffa176a348fd486aae7a72137f27456c74b199353073cda7bd13: Status 404 returned error can't find the container with id 033d2e727113ffa176a348fd486aae7a72137f27456c74b199353073cda7bd13
Apr 20 21:55:29.172136 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.172109 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"
Apr 20 21:55:29.315346 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.315318 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz"]
Apr 20 21:55:29.336270 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.336205 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s" event={"ID":"48d5a704-b77b-46fe-8c18-1246e15c6ce2","Type":"ContainerStarted","Data":"633c2a64f89599a6001ec0ddfe1ccaa3928a3a67891c382eee8ecbe0f172e0c6"}
Apr 20 21:55:29.336270 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:29.336244 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s" event={"ID":"48d5a704-b77b-46fe-8c18-1246e15c6ce2","Type":"ContainerStarted","Data":"033d2e727113ffa176a348fd486aae7a72137f27456c74b199353073cda7bd13"}
Apr 20 21:55:29.694670 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:55:29.694590 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b22ece_4292_4ffa_baa4_ff3757373b19.slice/crio-d9087c448d658d75fc332828aa4ef434fa63a38ef19833250666af9d3dd4dc64 WatchSource:0}: Error finding container d9087c448d658d75fc332828aa4ef434fa63a38ef19833250666af9d3dd4dc64: Status 404 returned error can't find the container with id d9087c448d658d75fc332828aa4ef434fa63a38ef19833250666af9d3dd4dc64
Apr 20 21:55:30.341795 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:30.341757 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz" event={"ID":"51b22ece-4292-4ffa-baa4-ff3757373b19","Type":"ContainerStarted","Data":"d9087c448d658d75fc332828aa4ef434fa63a38ef19833250666af9d3dd4dc64"}
Apr 20 21:55:30.343990 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:30.343936 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh" event={"ID":"1aac7f00-8545-40c9-906a-0719d15b0d78","Type":"ContainerStarted","Data":"ae44d20dabc328f51293fecc3e60535c1d0d6d21cc0887f491e622dec36761fe"}
Apr 20 21:55:30.344134 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:30.344038 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh"
Apr 20 21:55:30.345541 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:30.345510 2566 generic.go:358] "Generic (PLEG): container finished" podID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerID="633c2a64f89599a6001ec0ddfe1ccaa3928a3a67891c382eee8ecbe0f172e0c6" exitCode=0
Apr 20 21:55:30.345650 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:30.345550 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s" event={"ID":"48d5a704-b77b-46fe-8c18-1246e15c6ce2","Type":"ContainerDied","Data":"633c2a64f89599a6001ec0ddfe1ccaa3928a3a67891c382eee8ecbe0f172e0c6"}
Apr 20 21:55:30.365175 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:30.365121 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh" podStartSLOduration=2.680542671 podStartE2EDuration="5.365105035s" podCreationTimestamp="2026-04-20 21:55:25 +0000 UTC" firstStartedPulling="2026-04-20 21:55:27.058153832 +0000 UTC m=+495.866841178" lastFinishedPulling="2026-04-20 21:55:29.742716195 +0000 UTC m=+498.551403542" observedRunningTime="2026-04-20 21:55:30.36299539 +0000 UTC m=+499.171682758" watchObservedRunningTime="2026-04-20 21:55:30.365105035 +0000 UTC m=+499.173792410"
Apr 20 21:55:32.357204 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:32.357167 2566 generic.go:358] "Generic (PLEG): container finished" podID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerID="f49079a6a45326f88eca95a67240cac1b146f0b048b948d8bf364c17da25add7" exitCode=0
Apr 20 21:55:32.357677 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:32.357306 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s" event={"ID":"48d5a704-b77b-46fe-8c18-1246e15c6ce2","Type":"ContainerDied","Data":"f49079a6a45326f88eca95a67240cac1b146f0b048b948d8bf364c17da25add7"}
Apr 20 21:55:33.362335 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:33.362299 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz" event={"ID":"51b22ece-4292-4ffa-baa4-ff3757373b19","Type":"ContainerStarted","Data":"24dd8e1a7913db5ae7a13db2538b9bdec1a2abc67a6ed6938eefc3b418160d9b"}
Apr 20 21:55:33.363993 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:33.363970 2566 generic.go:358] "Generic (PLEG): container finished" podID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerID="3a6a9ff45854619ad4d7be2c7408bc4b6183074eafa42687ccbcfdd6b44ad89e" exitCode=0
Apr 20 21:55:33.364107 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:33.364087 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s" event={"ID":"48d5a704-b77b-46fe-8c18-1246e15c6ce2","Type":"ContainerDied","Data":"3a6a9ff45854619ad4d7be2c7408bc4b6183074eafa42687ccbcfdd6b44ad89e"}
Apr 20 21:55:33.379574 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:33.379529 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-d7f98b469-5hqqz" podStartSLOduration=2.527468474 podStartE2EDuration="5.379516985s" podCreationTimestamp="2026-04-20 21:55:28 +0000 UTC" firstStartedPulling="2026-04-20 21:55:29.696346279 +0000 UTC m=+498.505033622" lastFinishedPulling="2026-04-20 21:55:32.548394787 +0000 UTC m=+501.357082133" observedRunningTime="2026-04-20 21:55:33.377783701 +0000 UTC m=+502.186471066" watchObservedRunningTime="2026-04-20 21:55:33.379516985 +0000 UTC m=+502.188204349"
Apr 20 21:55:34.489906 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.489882 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:34.608255 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.608225 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-util\") pod \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") "
Apr 20 21:55:34.608441 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.608327 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-bundle\") pod \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") "
Apr 20 21:55:34.608441 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.608352 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqkrh\" (UniqueName: \"kubernetes.io/projected/48d5a704-b77b-46fe-8c18-1246e15c6ce2-kube-api-access-hqkrh\") pod \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\" (UID: \"48d5a704-b77b-46fe-8c18-1246e15c6ce2\") "
Apr 20 21:55:34.609152 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.609115 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-bundle" (OuterVolumeSpecName: "bundle") pod "48d5a704-b77b-46fe-8c18-1246e15c6ce2" (UID: "48d5a704-b77b-46fe-8c18-1246e15c6ce2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:55:34.610370 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.610341 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48d5a704-b77b-46fe-8c18-1246e15c6ce2-kube-api-access-hqkrh" (OuterVolumeSpecName: "kube-api-access-hqkrh") pod "48d5a704-b77b-46fe-8c18-1246e15c6ce2" (UID: "48d5a704-b77b-46fe-8c18-1246e15c6ce2"). InnerVolumeSpecName "kube-api-access-hqkrh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:55:34.701892 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.701825 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-util" (OuterVolumeSpecName: "util") pod "48d5a704-b77b-46fe-8c18-1246e15c6ce2" (UID: "48d5a704-b77b-46fe-8c18-1246e15c6ce2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:55:34.708970 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.708945 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:55:34.709045 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.708973 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hqkrh\" (UniqueName: \"kubernetes.io/projected/48d5a704-b77b-46fe-8c18-1246e15c6ce2-kube-api-access-hqkrh\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:55:34.709045 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:34.708982 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48d5a704-b77b-46fe-8c18-1246e15c6ce2-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:55:35.372235 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:35.372202 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s" event={"ID":"48d5a704-b77b-46fe-8c18-1246e15c6ce2","Type":"ContainerDied","Data":"033d2e727113ffa176a348fd486aae7a72137f27456c74b199353073cda7bd13"}
Apr 20 21:55:35.372235 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:35.372228 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w9r9s"
Apr 20 21:55:35.372235 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:35.372237 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="033d2e727113ffa176a348fd486aae7a72137f27456c74b199353073cda7bd13"
Apr 20 21:55:38.332230 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:38.332202 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-mmcb7"
Apr 20 21:55:42.839846 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.839816 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"]
Apr 20 21:55:42.840212 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.840198 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerName="extract"
Apr 20 21:55:42.840256 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.840216 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerName="extract"
Apr 20 21:55:42.840256 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.840234 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerName="util"
Apr 20 21:55:42.840256 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.840240 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerName="util"
Apr 20 21:55:42.840256 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.840257 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerName="pull"
Apr 20 21:55:42.840398 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.840262 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerName="pull"
Apr 20 21:55:42.840398 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.840353 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="48d5a704-b77b-46fe-8c18-1246e15c6ce2" containerName="extract"
Apr 20 21:55:42.848505 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.848486 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"
Apr 20 21:55:42.851336 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.851309 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 21:55:42.852529 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.852506 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5b4bm\""
Apr 20 21:55:42.852664 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.852517 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 21:55:42.855366 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.855347 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"]
Apr 20 21:55:42.976832 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.976805 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"
Apr 20 21:55:42.976961 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.976840 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trqx8\" (UniqueName: \"kubernetes.io/projected/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-kube-api-access-trqx8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"
Apr 20 21:55:42.976961 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:42.976875 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"
Apr 20 21:55:43.077873 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.077831 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"
Apr 20 21:55:43.077873 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.077878 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trqx8\" (UniqueName: \"kubernetes.io/projected/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-kube-api-access-trqx8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"
Apr 20 21:55:43.078048 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.077920 2566
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" Apr 20 21:55:43.078273 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.078252 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" Apr 20 21:55:43.078342 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.078275 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" Apr 20 21:55:43.086826 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.086800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trqx8\" (UniqueName: \"kubernetes.io/projected/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-kube-api-access-trqx8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" Apr 20 21:55:43.159947 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.159881 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" Apr 20 21:55:43.287737 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.287713 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7"] Apr 20 21:55:43.289737 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:55:43.289703 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e38521_d7dc_4da4_88a7_320f0fc7f0ca.slice/crio-02d95d4e10c813d69e08056e0957c5e29c3a1c1b8a536bddc826809bbaf6b8ae WatchSource:0}: Error finding container 02d95d4e10c813d69e08056e0957c5e29c3a1c1b8a536bddc826809bbaf6b8ae: Status 404 returned error can't find the container with id 02d95d4e10c813d69e08056e0957c5e29c3a1c1b8a536bddc826809bbaf6b8ae Apr 20 21:55:43.403496 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.403468 2566 generic.go:358] "Generic (PLEG): container finished" podID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerID="a71c5628a9aee91466e7cf9cc80499318157c5549df655e5c1e60630424d296b" exitCode=0 Apr 20 21:55:43.403604 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.403517 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" event={"ID":"82e38521-d7dc-4da4-88a7-320f0fc7f0ca","Type":"ContainerDied","Data":"a71c5628a9aee91466e7cf9cc80499318157c5549df655e5c1e60630424d296b"} Apr 20 21:55:43.403604 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:43.403538 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" event={"ID":"82e38521-d7dc-4da4-88a7-320f0fc7f0ca","Type":"ContainerStarted","Data":"02d95d4e10c813d69e08056e0957c5e29c3a1c1b8a536bddc826809bbaf6b8ae"} Apr 20 21:55:44.309348 ip-10-0-137-199 kubenswrapper[2566]: 
I0420 21:55:44.309318 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mwch4"] Apr 20 21:55:44.313859 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.313840 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:44.316910 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.316890 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 21:55:44.317016 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.316915 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 21:55:44.317016 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.316915 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-xpzmn\"" Apr 20 21:55:44.325875 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.325851 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mwch4"] Apr 20 21:55:44.388340 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.388314 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4e20e132-4460-4163-80d7-3606885b20bd-operator-config\") pod \"servicemesh-operator3-55f49c5f94-mwch4\" (UID: \"4e20e132-4460-4163-80d7-3606885b20bd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:44.388457 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.388365 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56qm\" (UniqueName: \"kubernetes.io/projected/4e20e132-4460-4163-80d7-3606885b20bd-kube-api-access-s56qm\") pod 
\"servicemesh-operator3-55f49c5f94-mwch4\" (UID: \"4e20e132-4460-4163-80d7-3606885b20bd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:44.408764 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.408733 2566 generic.go:358] "Generic (PLEG): container finished" podID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerID="a73c610df4eaafa3c55e569c4c72647c101643671b117b0e057b142797928130" exitCode=0 Apr 20 21:55:44.408858 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.408813 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" event={"ID":"82e38521-d7dc-4da4-88a7-320f0fc7f0ca","Type":"ContainerDied","Data":"a73c610df4eaafa3c55e569c4c72647c101643671b117b0e057b142797928130"} Apr 20 21:55:44.488922 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.488899 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4e20e132-4460-4163-80d7-3606885b20bd-operator-config\") pod \"servicemesh-operator3-55f49c5f94-mwch4\" (UID: \"4e20e132-4460-4163-80d7-3606885b20bd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:44.489041 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.488946 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s56qm\" (UniqueName: \"kubernetes.io/projected/4e20e132-4460-4163-80d7-3606885b20bd-kube-api-access-s56qm\") pod \"servicemesh-operator3-55f49c5f94-mwch4\" (UID: \"4e20e132-4460-4163-80d7-3606885b20bd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:44.491595 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.491572 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4e20e132-4460-4163-80d7-3606885b20bd-operator-config\") pod 
\"servicemesh-operator3-55f49c5f94-mwch4\" (UID: \"4e20e132-4460-4163-80d7-3606885b20bd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:44.497168 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.497148 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56qm\" (UniqueName: \"kubernetes.io/projected/4e20e132-4460-4163-80d7-3606885b20bd-kube-api-access-s56qm\") pod \"servicemesh-operator3-55f49c5f94-mwch4\" (UID: \"4e20e132-4460-4163-80d7-3606885b20bd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:44.622877 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.622851 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:44.744738 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:44.744707 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mwch4"] Apr 20 21:55:44.747346 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:55:44.747313 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e20e132_4460_4163_80d7_3606885b20bd.slice/crio-8b3ee05217a8fd243e3c5a257bc5fc9314febb6907e3c8d7b95e821593ddfe50 WatchSource:0}: Error finding container 8b3ee05217a8fd243e3c5a257bc5fc9314febb6907e3c8d7b95e821593ddfe50: Status 404 returned error can't find the container with id 8b3ee05217a8fd243e3c5a257bc5fc9314febb6907e3c8d7b95e821593ddfe50 Apr 20 21:55:45.413943 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:45.413849 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" event={"ID":"4e20e132-4460-4163-80d7-3606885b20bd","Type":"ContainerStarted","Data":"8b3ee05217a8fd243e3c5a257bc5fc9314febb6907e3c8d7b95e821593ddfe50"} Apr 20 21:55:45.415478 ip-10-0-137-199 kubenswrapper[2566]: 
I0420 21:55:45.415450 2566 generic.go:358] "Generic (PLEG): container finished" podID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerID="d79e3244f1bb0945d8e2ebea13f6621048babd5615da5e96abe3c49b3b064f9d" exitCode=0 Apr 20 21:55:45.415603 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:45.415530 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" event={"ID":"82e38521-d7dc-4da4-88a7-320f0fc7f0ca","Type":"ContainerDied","Data":"d79e3244f1bb0945d8e2ebea13f6621048babd5615da5e96abe3c49b3b064f9d"} Apr 20 21:55:46.561779 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.561755 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" Apr 20 21:55:46.709193 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.709161 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trqx8\" (UniqueName: \"kubernetes.io/projected/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-kube-api-access-trqx8\") pod \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " Apr 20 21:55:46.709382 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.709347 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-bundle\") pod \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " Apr 20 21:55:46.709450 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.709406 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-util\") pod \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\" (UID: \"82e38521-d7dc-4da4-88a7-320f0fc7f0ca\") " Apr 20 21:55:46.710443 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:55:46.710414 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-bundle" (OuterVolumeSpecName: "bundle") pod "82e38521-d7dc-4da4-88a7-320f0fc7f0ca" (UID: "82e38521-d7dc-4da4-88a7-320f0fc7f0ca"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:55:46.711884 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.711853 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-kube-api-access-trqx8" (OuterVolumeSpecName: "kube-api-access-trqx8") pod "82e38521-d7dc-4da4-88a7-320f0fc7f0ca" (UID: "82e38521-d7dc-4da4-88a7-320f0fc7f0ca"). InnerVolumeSpecName "kube-api-access-trqx8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:55:46.715777 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.715740 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-util" (OuterVolumeSpecName: "util") pod "82e38521-d7dc-4da4-88a7-320f0fc7f0ca" (UID: "82e38521-d7dc-4da4-88a7-320f0fc7f0ca"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:55:46.810708 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.810669 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:55:46.810708 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.810698 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trqx8\" (UniqueName: \"kubernetes.io/projected/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-kube-api-access-trqx8\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:55:46.810708 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:46.810713 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82e38521-d7dc-4da4-88a7-320f0fc7f0ca-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:55:47.425190 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:47.425161 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" Apr 20 21:55:47.425370 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:47.425161 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c286pb7" event={"ID":"82e38521-d7dc-4da4-88a7-320f0fc7f0ca","Type":"ContainerDied","Data":"02d95d4e10c813d69e08056e0957c5e29c3a1c1b8a536bddc826809bbaf6b8ae"} Apr 20 21:55:47.425370 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:47.425301 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d95d4e10c813d69e08056e0957c5e29c3a1c1b8a536bddc826809bbaf6b8ae" Apr 20 21:55:48.430462 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:48.430429 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" event={"ID":"4e20e132-4460-4163-80d7-3606885b20bd","Type":"ContainerStarted","Data":"964f936f9acde359b9b45c986ae9e44ad8a78511d50845dad74ebb807dc5aa9e"} Apr 20 21:55:48.430904 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:48.430549 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:48.456557 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:48.456516 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" podStartSLOduration=1.575990352 podStartE2EDuration="4.456503062s" podCreationTimestamp="2026-04-20 21:55:44 +0000 UTC" firstStartedPulling="2026-04-20 21:55:44.74985697 +0000 UTC m=+513.558544314" lastFinishedPulling="2026-04-20 21:55:47.630369682 +0000 UTC m=+516.439057024" observedRunningTime="2026-04-20 21:55:48.453032955 +0000 UTC m=+517.261720324" watchObservedRunningTime="2026-04-20 21:55:48.456503062 +0000 UTC m=+517.265190428" Apr 20 21:55:59.437553 
ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.437519 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mwch4" Apr 20 21:55:59.528326 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.528296 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z"] Apr 20 21:55:59.528656 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.528643 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerName="util" Apr 20 21:55:59.528699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.528657 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerName="util" Apr 20 21:55:59.528699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.528665 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerName="extract" Apr 20 21:55:59.528699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.528670 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerName="extract" Apr 20 21:55:59.528699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.528682 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerName="pull" Apr 20 21:55:59.528699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.528687 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerName="pull" Apr 20 21:55:59.528849 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.528751 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="82e38521-d7dc-4da4-88a7-320f0fc7f0ca" containerName="extract" Apr 20 21:55:59.582393 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.582363 2566 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z"] Apr 20 21:55:59.582544 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.582496 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.585117 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.585093 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 21:55:59.585295 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.585174 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 21:55:59.585295 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.585222 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 21:55:59.585295 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.585272 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 21:55:59.585448 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.585309 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-n2lsn\"" Apr 20 21:55:59.713339 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.713301 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.713339 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.713338 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/bd180856-1de8-453a-9572-dac5318b40fb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.713549 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.713356 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bd180856-1de8-453a-9572-dac5318b40fb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.713549 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.713417 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/bd180856-1de8-453a-9572-dac5318b40fb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.713549 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.713489 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhfw8\" (UniqueName: \"kubernetes.io/projected/bd180856-1de8-453a-9572-dac5318b40fb-kube-api-access-xhfw8\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.713549 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.713526 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.713703 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.713570 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.814845 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.814818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.815022 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.814858 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.815022 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.814904 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.815022 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.814932 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/bd180856-1de8-453a-9572-dac5318b40fb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.815022 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.814956 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bd180856-1de8-453a-9572-dac5318b40fb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.815022 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.814994 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/bd180856-1de8-453a-9572-dac5318b40fb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.815267 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.815041 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhfw8\" (UniqueName: \"kubernetes.io/projected/bd180856-1de8-453a-9572-dac5318b40fb-kube-api-access-xhfw8\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.815748 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.815720 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/bd180856-1de8-453a-9572-dac5318b40fb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.817428 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.817399 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.817592 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.817567 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/bd180856-1de8-453a-9572-dac5318b40fb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.817739 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.817720 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.817812 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.817793 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd180856-1de8-453a-9572-dac5318b40fb-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.827940 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.827919 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bd180856-1de8-453a-9572-dac5318b40fb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.828346 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.828328 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhfw8\" (UniqueName: \"kubernetes.io/projected/bd180856-1de8-453a-9572-dac5318b40fb-kube-api-access-xhfw8\") pod \"istiod-openshift-gateway-55ff986f96-bk64z\" (UID: \"bd180856-1de8-453a-9572-dac5318b40fb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:55:59.892101 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:55:59.892074 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:56:00.061730 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:00.061595 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z"] Apr 20 21:56:00.063738 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:56:00.063702 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd180856_1de8_453a_9572_dac5318b40fb.slice/crio-cec88270d92f9e9720510fdd75acefa410061f16d1761d470b4b5c399b384c28 WatchSource:0}: Error finding container cec88270d92f9e9720510fdd75acefa410061f16d1761d470b4b5c399b384c28: Status 404 returned error can't find the container with id cec88270d92f9e9720510fdd75acefa410061f16d1761d470b4b5c399b384c28 Apr 20 21:56:00.483743 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:00.483707 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" event={"ID":"bd180856-1de8-453a-9572-dac5318b40fb","Type":"ContainerStarted","Data":"cec88270d92f9e9720510fdd75acefa410061f16d1761d470b4b5c399b384c28"} Apr 20 21:56:01.356622 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:01.356590 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-7xpgh" Apr 20 21:56:02.600923 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:02.600884 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 20 21:56:02.601307 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:02.600965 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 20 21:56:03.499023 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:56:03.498979 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" event={"ID":"bd180856-1de8-453a-9572-dac5318b40fb","Type":"ContainerStarted","Data":"c280a978608397e1d2cf58a7d1b0c82bcec7126f043d0a4e7a6bd74e98658369"} Apr 20 21:56:03.499221 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:03.499186 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:56:03.501136 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:03.501112 2566 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-bk64z container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 21:56:03.501267 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:03.501179 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" podUID="bd180856-1de8-453a-9572-dac5318b40fb" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 21:56:03.521184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:03.521122 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" podStartSLOduration=1.9861047630000002 podStartE2EDuration="4.521105558s" podCreationTimestamp="2026-04-20 21:55:59 +0000 UTC" firstStartedPulling="2026-04-20 21:56:00.065619108 +0000 UTC m=+528.874306451" lastFinishedPulling="2026-04-20 21:56:02.600619899 +0000 UTC m=+531.409307246" observedRunningTime="2026-04-20 21:56:03.519233233 +0000 UTC m=+532.327920600" watchObservedRunningTime="2026-04-20 21:56:03.521105558 +0000 UTC m=+532.329792925" Apr 20 21:56:04.502784 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:04.502758 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bk64z" Apr 20 21:56:36.692132 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.692095 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr"] Apr 20 21:56:36.697846 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.697827 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.700874 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.700853 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pljh2\"" Apr 20 21:56:36.701151 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.700873 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 21:56:36.701151 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.700854 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 21:56:36.702106 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.702083 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr"] Apr 20 21:56:36.800912 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.800874 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.801071 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.801010 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm62p\" (UniqueName: \"kubernetes.io/projected/09a177d4-c3f7-4071-9617-cab492afb928-kube-api-access-pm62p\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.801125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.801066 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.901698 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.901670 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm62p\" (UniqueName: \"kubernetes.io/projected/09a177d4-c3f7-4071-9617-cab492afb928-kube-api-access-pm62p\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.901698 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.901701 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.901909 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.901738 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.902103 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.902085 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.902155 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.902110 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:36.909665 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:36.909635 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm62p\" (UniqueName: \"kubernetes.io/projected/09a177d4-c3f7-4071-9617-cab492afb928-kube-api-access-pm62p\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:37.008047 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.008022 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:37.081197 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.081169 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj"] Apr 20 21:56:37.085059 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.084992 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.094804 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.094776 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj"] Apr 20 21:56:37.103526 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.103497 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.103647 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.103553 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbr57\" (UniqueName: \"kubernetes.io/projected/ea440a99-84be-442d-a9c0-eb81422af518-kube-api-access-lbr57\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.103740 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.103722 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.131173 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.131145 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr"] Apr 20 21:56:37.133214 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:56:37.133179 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a177d4_c3f7_4071_9617_cab492afb928.slice/crio-f46e162dc39eeda1ca624402eb6dc75f967b686395f5181c3841d6473c74ac6f WatchSource:0}: Error finding container f46e162dc39eeda1ca624402eb6dc75f967b686395f5181c3841d6473c74ac6f: Status 404 returned error can't find the container with id f46e162dc39eeda1ca624402eb6dc75f967b686395f5181c3841d6473c74ac6f Apr 20 21:56:37.204560 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.204538 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.204688 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.204577 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.204688 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.204606 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbr57\" (UniqueName: \"kubernetes.io/projected/ea440a99-84be-442d-a9c0-eb81422af518-kube-api-access-lbr57\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.204898 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.204876 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.204961 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.204912 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.212459 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.212438 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbr57\" (UniqueName: \"kubernetes.io/projected/ea440a99-84be-442d-a9c0-eb81422af518-kube-api-access-lbr57\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 
21:56:37.399400 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.399334 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:37.484349 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.484316 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z"] Apr 20 21:56:37.487682 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.487661 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.495940 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.495748 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z"] Apr 20 21:56:37.506920 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.506891 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f757t\" (UniqueName: \"kubernetes.io/projected/49fd39c2-5908-43b7-9079-8c696fe2d198-kube-api-access-f757t\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.507068 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.506941 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.507068 ip-10-0-137-199 kubenswrapper[2566]: I0420 
21:56:37.506989 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.607623 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.607595 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f757t\" (UniqueName: \"kubernetes.io/projected/49fd39c2-5908-43b7-9079-8c696fe2d198-kube-api-access-f757t\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.607757 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.607643 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.607757 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.607664 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.607954 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.607934 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.607991 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.607959 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.615736 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.615716 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f757t\" (UniqueName: \"kubernetes.io/projected/49fd39c2-5908-43b7-9079-8c696fe2d198-kube-api-access-f757t\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.620709 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.620684 2566 generic.go:358] "Generic (PLEG): container finished" podID="09a177d4-c3f7-4071-9617-cab492afb928" containerID="a3f68658343a7f00bdebbb6c80ce11e8cad926b2798cd16c058539eaf27cb73e" exitCode=0 Apr 20 21:56:37.620796 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.620770 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" event={"ID":"09a177d4-c3f7-4071-9617-cab492afb928","Type":"ContainerDied","Data":"a3f68658343a7f00bdebbb6c80ce11e8cad926b2798cd16c058539eaf27cb73e"} Apr 20 21:56:37.620833 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.620803 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" event={"ID":"09a177d4-c3f7-4071-9617-cab492afb928","Type":"ContainerStarted","Data":"f46e162dc39eeda1ca624402eb6dc75f967b686395f5181c3841d6473c74ac6f"} Apr 20 21:56:37.728830 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.728807 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj"] Apr 20 21:56:37.730547 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:56:37.730517 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea440a99_84be_442d_a9c0_eb81422af518.slice/crio-3278e9668b634209ad43c430ae56f2127188c8fc9db02608e0f4342d66aeaf15 WatchSource:0}: Error finding container 3278e9668b634209ad43c430ae56f2127188c8fc9db02608e0f4342d66aeaf15: Status 404 returned error can't find the container with id 3278e9668b634209ad43c430ae56f2127188c8fc9db02608e0f4342d66aeaf15 Apr 20 21:56:37.799408 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.799379 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:37.889363 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.889323 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc"] Apr 20 21:56:37.899513 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.897443 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:37.900294 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.900252 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc"] Apr 20 21:56:37.929274 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:37.929249 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z"] Apr 20 21:56:37.930429 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:56:37.930398 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49fd39c2_5908_43b7_9079_8c696fe2d198.slice/crio-84bb358af42c883d7d266e9d6eb3fb4ef8fd9f1478dba157a57a492c68713cff WatchSource:0}: Error finding container 84bb358af42c883d7d266e9d6eb3fb4ef8fd9f1478dba157a57a492c68713cff: Status 404 returned error can't find the container with id 84bb358af42c883d7d266e9d6eb3fb4ef8fd9f1478dba157a57a492c68713cff Apr 20 21:56:38.010367 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.010326 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8kqd\" (UniqueName: \"kubernetes.io/projected/a245ea99-e813-4fb4-b96c-eb5f7851d73f-kube-api-access-c8kqd\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.010550 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.010390 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: 
\"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.010550 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.010425 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.110957 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.110932 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8kqd\" (UniqueName: \"kubernetes.io/projected/a245ea99-e813-4fb4-b96c-eb5f7851d73f-kube-api-access-c8kqd\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.111111 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.110985 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.111111 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.111025 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.111447 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.111428 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.111492 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.111442 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.119016 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.118994 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8kqd\" (UniqueName: \"kubernetes.io/projected/a245ea99-e813-4fb4-b96c-eb5f7851d73f-kube-api-access-c8kqd\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.209975 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.209950 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:38.331966 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.331942 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc"] Apr 20 21:56:38.371829 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:56:38.371801 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda245ea99_e813_4fb4_b96c_eb5f7851d73f.slice/crio-ee499e251d892751f14b916cd4115d35e52236ea0e7eae0e9475b5369009ed12 WatchSource:0}: Error finding container ee499e251d892751f14b916cd4115d35e52236ea0e7eae0e9475b5369009ed12: Status 404 returned error can't find the container with id ee499e251d892751f14b916cd4115d35e52236ea0e7eae0e9475b5369009ed12 Apr 20 21:56:38.626594 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.626566 2566 generic.go:358] "Generic (PLEG): container finished" podID="09a177d4-c3f7-4071-9617-cab492afb928" containerID="ab04fb93951456f6a551033dae298f4ecfa94b4b8f987bd06b04854c3fd6623a" exitCode=0 Apr 20 21:56:38.626740 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.626632 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" event={"ID":"09a177d4-c3f7-4071-9617-cab492afb928","Type":"ContainerDied","Data":"ab04fb93951456f6a551033dae298f4ecfa94b4b8f987bd06b04854c3fd6623a"} Apr 20 21:56:38.628115 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.628084 2566 generic.go:358] "Generic (PLEG): container finished" podID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerID="dfc881d9ce97d15df5d0bfe6371c2bf5519ed26147e1b803a5f4c87134a8afd8" exitCode=0 Apr 20 21:56:38.628226 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.628163 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" event={"ID":"49fd39c2-5908-43b7-9079-8c696fe2d198","Type":"ContainerDied","Data":"dfc881d9ce97d15df5d0bfe6371c2bf5519ed26147e1b803a5f4c87134a8afd8"} Apr 20 21:56:38.628226 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.628197 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" event={"ID":"49fd39c2-5908-43b7-9079-8c696fe2d198","Type":"ContainerStarted","Data":"84bb358af42c883d7d266e9d6eb3fb4ef8fd9f1478dba157a57a492c68713cff"} Apr 20 21:56:38.629914 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.629893 2566 generic.go:358] "Generic (PLEG): container finished" podID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerID="17f81e2d782064fbacb2756fba1a1209a08127261904642b8760479af3dbd348" exitCode=0 Apr 20 21:56:38.630050 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.630006 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" event={"ID":"a245ea99-e813-4fb4-b96c-eb5f7851d73f","Type":"ContainerDied","Data":"17f81e2d782064fbacb2756fba1a1209a08127261904642b8760479af3dbd348"} Apr 20 21:56:38.630050 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.630031 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" event={"ID":"a245ea99-e813-4fb4-b96c-eb5f7851d73f","Type":"ContainerStarted","Data":"ee499e251d892751f14b916cd4115d35e52236ea0e7eae0e9475b5369009ed12"} Apr 20 21:56:38.631515 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.631493 2566 generic.go:358] "Generic (PLEG): container finished" podID="ea440a99-84be-442d-a9c0-eb81422af518" containerID="183e0be63177a72a8e9743128e5504bef79888d66553a70ee8b15ec5d345fd94" exitCode=0 Apr 20 21:56:38.631621 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.631544 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" event={"ID":"ea440a99-84be-442d-a9c0-eb81422af518","Type":"ContainerDied","Data":"183e0be63177a72a8e9743128e5504bef79888d66553a70ee8b15ec5d345fd94"} Apr 20 21:56:38.631621 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:38.631560 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" event={"ID":"ea440a99-84be-442d-a9c0-eb81422af518","Type":"ContainerStarted","Data":"3278e9668b634209ad43c430ae56f2127188c8fc9db02608e0f4342d66aeaf15"} Apr 20 21:56:39.637473 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:39.637444 2566 generic.go:358] "Generic (PLEG): container finished" podID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerID="e33c142c1951407eebd4d20ce22481fa71c9695826ae5a2a360a1464a1bf36cf" exitCode=0 Apr 20 21:56:39.637888 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:39.637531 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" event={"ID":"a245ea99-e813-4fb4-b96c-eb5f7851d73f","Type":"ContainerDied","Data":"e33c142c1951407eebd4d20ce22481fa71c9695826ae5a2a360a1464a1bf36cf"} Apr 20 21:56:39.639140 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:39.639113 2566 generic.go:358] "Generic (PLEG): container finished" podID="ea440a99-84be-442d-a9c0-eb81422af518" containerID="78f64fd964db8ddcf36e8eef4f0f1b71fafcc4a3743c6ba6e427d8165edcbbf8" exitCode=0 Apr 20 21:56:39.639251 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:39.639216 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" event={"ID":"ea440a99-84be-442d-a9c0-eb81422af518","Type":"ContainerDied","Data":"78f64fd964db8ddcf36e8eef4f0f1b71fafcc4a3743c6ba6e427d8165edcbbf8"} Apr 20 21:56:39.641378 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:56:39.641358 2566 generic.go:358] "Generic (PLEG): container finished" podID="09a177d4-c3f7-4071-9617-cab492afb928" containerID="afb7ea84abd09fd79bcbf7410606d05bef5d699dd00e569212f06b1c19497153" exitCode=0 Apr 20 21:56:39.641473 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:39.641421 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" event={"ID":"09a177d4-c3f7-4071-9617-cab492afb928","Type":"ContainerDied","Data":"afb7ea84abd09fd79bcbf7410606d05bef5d699dd00e569212f06b1c19497153"} Apr 20 21:56:39.643014 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:39.642996 2566 generic.go:358] "Generic (PLEG): container finished" podID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerID="8ce4318c8ea849f690f18e9ed1500caeaeed29fe10b6d61ad9bd8b10c6d57b06" exitCode=0 Apr 20 21:56:39.643144 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:39.643047 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" event={"ID":"49fd39c2-5908-43b7-9079-8c696fe2d198","Type":"ContainerDied","Data":"8ce4318c8ea849f690f18e9ed1500caeaeed29fe10b6d61ad9bd8b10c6d57b06"} Apr 20 21:56:40.649050 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.649016 2566 generic.go:358] "Generic (PLEG): container finished" podID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerID="f0665c263a69cf04896b1999bef11474ff06725f061dcf33cd64d114d6c34f6a" exitCode=0 Apr 20 21:56:40.649488 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.649100 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" event={"ID":"a245ea99-e813-4fb4-b96c-eb5f7851d73f","Type":"ContainerDied","Data":"f0665c263a69cf04896b1999bef11474ff06725f061dcf33cd64d114d6c34f6a"} Apr 20 21:56:40.654024 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.653998 2566 
generic.go:358] "Generic (PLEG): container finished" podID="ea440a99-84be-442d-a9c0-eb81422af518" containerID="d9438ddccc792aae52e68855026139c860876278e11ea3cb77ec3e6b8556fe1f" exitCode=0 Apr 20 21:56:40.654133 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.654083 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" event={"ID":"ea440a99-84be-442d-a9c0-eb81422af518","Type":"ContainerDied","Data":"d9438ddccc792aae52e68855026139c860876278e11ea3cb77ec3e6b8556fe1f"} Apr 20 21:56:40.655955 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.655928 2566 generic.go:358] "Generic (PLEG): container finished" podID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerID="563795ced58be802aac26a8a1098809a2d38744ff5f20497b84d6e2bd78eb8d6" exitCode=0 Apr 20 21:56:40.656099 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.656004 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" event={"ID":"49fd39c2-5908-43b7-9079-8c696fe2d198","Type":"ContainerDied","Data":"563795ced58be802aac26a8a1098809a2d38744ff5f20497b84d6e2bd78eb8d6"} Apr 20 21:56:40.786059 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.786034 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:40.833617 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.833592 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-bundle\") pod \"09a177d4-c3f7-4071-9617-cab492afb928\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " Apr 20 21:56:40.833778 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.833651 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm62p\" (UniqueName: \"kubernetes.io/projected/09a177d4-c3f7-4071-9617-cab492afb928-kube-api-access-pm62p\") pod \"09a177d4-c3f7-4071-9617-cab492afb928\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " Apr 20 21:56:40.833778 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.833700 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-util\") pod \"09a177d4-c3f7-4071-9617-cab492afb928\" (UID: \"09a177d4-c3f7-4071-9617-cab492afb928\") " Apr 20 21:56:40.834132 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.834099 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-bundle" (OuterVolumeSpecName: "bundle") pod "09a177d4-c3f7-4071-9617-cab492afb928" (UID: "09a177d4-c3f7-4071-9617-cab492afb928"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:56:40.835922 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.835893 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a177d4-c3f7-4071-9617-cab492afb928-kube-api-access-pm62p" (OuterVolumeSpecName: "kube-api-access-pm62p") pod "09a177d4-c3f7-4071-9617-cab492afb928" (UID: "09a177d4-c3f7-4071-9617-cab492afb928"). InnerVolumeSpecName "kube-api-access-pm62p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:56:40.839113 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.839091 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-util" (OuterVolumeSpecName: "util") pod "09a177d4-c3f7-4071-9617-cab492afb928" (UID: "09a177d4-c3f7-4071-9617-cab492afb928"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:56:40.934535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.934474 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:40.934535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.934498 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09a177d4-c3f7-4071-9617-cab492afb928-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:40.934535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:40.934507 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pm62p\" (UniqueName: \"kubernetes.io/projected/09a177d4-c3f7-4071-9617-cab492afb928-kube-api-access-pm62p\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:41.662308 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.662266 2566 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" Apr 20 21:56:41.662308 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.662250 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr" event={"ID":"09a177d4-c3f7-4071-9617-cab492afb928","Type":"ContainerDied","Data":"f46e162dc39eeda1ca624402eb6dc75f967b686395f5181c3841d6473c74ac6f"} Apr 20 21:56:41.662778 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.662323 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f46e162dc39eeda1ca624402eb6dc75f967b686395f5181c3841d6473c74ac6f" Apr 20 21:56:41.820677 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.820645 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:41.825417 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.825269 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:41.842459 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.842429 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-bundle\") pod \"49fd39c2-5908-43b7-9079-8c696fe2d198\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " Apr 20 21:56:41.842658 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.842636 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f757t\" (UniqueName: \"kubernetes.io/projected/49fd39c2-5908-43b7-9079-8c696fe2d198-kube-api-access-f757t\") pod \"49fd39c2-5908-43b7-9079-8c696fe2d198\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " Apr 20 21:56:41.842768 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.842689 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-util\") pod \"49fd39c2-5908-43b7-9079-8c696fe2d198\" (UID: \"49fd39c2-5908-43b7-9079-8c696fe2d198\") " Apr 20 21:56:41.843160 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.843133 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-bundle" (OuterVolumeSpecName: "bundle") pod "49fd39c2-5908-43b7-9079-8c696fe2d198" (UID: "49fd39c2-5908-43b7-9079-8c696fe2d198"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:56:41.844916 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.844883 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49fd39c2-5908-43b7-9079-8c696fe2d198-kube-api-access-f757t" (OuterVolumeSpecName: "kube-api-access-f757t") pod "49fd39c2-5908-43b7-9079-8c696fe2d198" (UID: "49fd39c2-5908-43b7-9079-8c696fe2d198"). InnerVolumeSpecName "kube-api-access-f757t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:56:41.848532 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.848505 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-util" (OuterVolumeSpecName: "util") pod "49fd39c2-5908-43b7-9079-8c696fe2d198" (UID: "49fd39c2-5908-43b7-9079-8c696fe2d198"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:56:41.858772 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.858750 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:41.943581 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.943507 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-bundle\") pod \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " Apr 20 21:56:41.943581 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.943569 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-bundle\") pod \"ea440a99-84be-442d-a9c0-eb81422af518\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " Apr 20 21:56:41.943792 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.943589 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8kqd\" (UniqueName: \"kubernetes.io/projected/a245ea99-e813-4fb4-b96c-eb5f7851d73f-kube-api-access-c8kqd\") pod \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " Apr 20 21:56:41.943792 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.943623 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-util\") pod \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\" (UID: \"a245ea99-e813-4fb4-b96c-eb5f7851d73f\") " Apr 20 21:56:41.943792 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.943658 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbr57\" (UniqueName: \"kubernetes.io/projected/ea440a99-84be-442d-a9c0-eb81422af518-kube-api-access-lbr57\") pod \"ea440a99-84be-442d-a9c0-eb81422af518\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " Apr 20 21:56:41.943792 ip-10-0-137-199 
kubenswrapper[2566]: I0420 21:56:41.943690 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-util\") pod \"ea440a99-84be-442d-a9c0-eb81422af518\" (UID: \"ea440a99-84be-442d-a9c0-eb81422af518\") " Apr 20 21:56:41.943990 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.943905 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f757t\" (UniqueName: \"kubernetes.io/projected/49fd39c2-5908-43b7-9079-8c696fe2d198-kube-api-access-f757t\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:41.943990 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.943926 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:41.943990 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.943940 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49fd39c2-5908-43b7-9079-8c696fe2d198-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:41.944135 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.944029 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-bundle" (OuterVolumeSpecName: "bundle") pod "a245ea99-e813-4fb4-b96c-eb5f7851d73f" (UID: "a245ea99-e813-4fb4-b96c-eb5f7851d73f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:56:41.944420 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.944375 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-bundle" (OuterVolumeSpecName: "bundle") pod "ea440a99-84be-442d-a9c0-eb81422af518" (UID: "ea440a99-84be-442d-a9c0-eb81422af518"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:56:41.945756 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.945730 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a245ea99-e813-4fb4-b96c-eb5f7851d73f-kube-api-access-c8kqd" (OuterVolumeSpecName: "kube-api-access-c8kqd") pod "a245ea99-e813-4fb4-b96c-eb5f7851d73f" (UID: "a245ea99-e813-4fb4-b96c-eb5f7851d73f"). InnerVolumeSpecName "kube-api-access-c8kqd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:56:41.945842 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.945781 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea440a99-84be-442d-a9c0-eb81422af518-kube-api-access-lbr57" (OuterVolumeSpecName: "kube-api-access-lbr57") pod "ea440a99-84be-442d-a9c0-eb81422af518" (UID: "ea440a99-84be-442d-a9c0-eb81422af518"). InnerVolumeSpecName "kube-api-access-lbr57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:56:41.948699 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.948660 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-util" (OuterVolumeSpecName: "util") pod "ea440a99-84be-442d-a9c0-eb81422af518" (UID: "ea440a99-84be-442d-a9c0-eb81422af518"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:56:41.949360 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:41.949342 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-util" (OuterVolumeSpecName: "util") pod "a245ea99-e813-4fb4-b96c-eb5f7851d73f" (UID: "a245ea99-e813-4fb4-b96c-eb5f7851d73f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:56:42.044899 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.044878 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:42.044899 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.044899 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:42.045052 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.044908 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c8kqd\" (UniqueName: \"kubernetes.io/projected/a245ea99-e813-4fb4-b96c-eb5f7851d73f-kube-api-access-c8kqd\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:42.045052 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.044916 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a245ea99-e813-4fb4-b96c-eb5f7851d73f-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:42.045052 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.044925 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbr57\" (UniqueName: \"kubernetes.io/projected/ea440a99-84be-442d-a9c0-eb81422af518-kube-api-access-lbr57\") on node 
\"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:42.045052 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.044933 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea440a99-84be-442d-a9c0-eb81422af518-util\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 21:56:42.667827 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.667793 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" event={"ID":"a245ea99-e813-4fb4-b96c-eb5f7851d73f","Type":"ContainerDied","Data":"ee499e251d892751f14b916cd4115d35e52236ea0e7eae0e9475b5369009ed12"} Apr 20 21:56:42.667827 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.667833 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee499e251d892751f14b916cd4115d35e52236ea0e7eae0e9475b5369009ed12" Apr 20 21:56:42.668405 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.667812 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc" Apr 20 21:56:42.669571 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.669548 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" Apr 20 21:56:42.669698 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.669566 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj" event={"ID":"ea440a99-84be-442d-a9c0-eb81422af518","Type":"ContainerDied","Data":"3278e9668b634209ad43c430ae56f2127188c8fc9db02608e0f4342d66aeaf15"} Apr 20 21:56:42.669698 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.669596 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3278e9668b634209ad43c430ae56f2127188c8fc9db02608e0f4342d66aeaf15" Apr 20 21:56:42.671300 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.671261 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" event={"ID":"49fd39c2-5908-43b7-9079-8c696fe2d198","Type":"ContainerDied","Data":"84bb358af42c883d7d266e9d6eb3fb4ef8fd9f1478dba157a57a492c68713cff"} Apr 20 21:56:42.671404 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.671318 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84bb358af42c883d7d266e9d6eb3fb4ef8fd9f1478dba157a57a492c68713cff" Apr 20 21:56:42.671404 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:42.671271 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z" Apr 20 21:56:44.156048 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156004 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-844ff484b4-jvpzl"] Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156427 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerName="util" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156439 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerName="util" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156448 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerName="extract" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156456 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerName="extract" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156467 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09a177d4-c3f7-4071-9617-cab492afb928" containerName="util" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156473 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a177d4-c3f7-4071-9617-cab492afb928" containerName="util" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156482 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea440a99-84be-442d-a9c0-eb81422af518" containerName="util" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156487 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea440a99-84be-442d-a9c0-eb81422af518" 
containerName="util" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156501 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09a177d4-c3f7-4071-9617-cab492afb928" containerName="pull" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156506 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a177d4-c3f7-4071-9617-cab492afb928" containerName="pull" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156511 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea440a99-84be-442d-a9c0-eb81422af518" containerName="pull" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156516 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea440a99-84be-442d-a9c0-eb81422af518" containerName="pull" Apr 20 21:56:44.156519 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156523 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerName="pull" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156530 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerName="pull" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156538 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156546 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156553 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerName="util" Apr 20 21:56:44.157365 
ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156559 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerName="util" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156567 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09a177d4-c3f7-4071-9617-cab492afb928" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156571 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a177d4-c3f7-4071-9617-cab492afb928" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156578 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerName="pull" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156583 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerName="pull" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156589 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea440a99-84be-442d-a9c0-eb81422af518" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156594 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea440a99-84be-442d-a9c0-eb81422af518" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156653 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="49fd39c2-5908-43b7-9079-8c696fe2d198" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156662 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea440a99-84be-442d-a9c0-eb81422af518" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156669 2566 
memory_manager.go:356] "RemoveStaleState removing state" podUID="a245ea99-e813-4fb4-b96c-eb5f7851d73f" containerName="extract" Apr 20 21:56:44.157365 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.156676 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="09a177d4-c3f7-4071-9617-cab492afb928" containerName="extract" Apr 20 21:56:44.161338 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.161315 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.173003 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.172979 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-844ff484b4-jvpzl"] Apr 20 21:56:44.265295 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.265258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-console-config\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.265469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.265334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-service-ca\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.265469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.265362 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-trusted-ca-bundle\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " 
pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.265469 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.265395 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf2012b-ff7f-450c-beca-96f972eb0894-console-serving-cert\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.265641 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.265469 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-oauth-serving-cert\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.265641 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.265525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1bf2012b-ff7f-450c-beca-96f972eb0894-console-oauth-config\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.265641 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.265555 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6dm9\" (UniqueName: \"kubernetes.io/projected/1bf2012b-ff7f-450c-beca-96f972eb0894-kube-api-access-n6dm9\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.366731 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.366697 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-service-ca\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.366731 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.366739 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-trusted-ca-bundle\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.366975 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.366873 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf2012b-ff7f-450c-beca-96f972eb0894-console-serving-cert\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.366975 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.366925 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-oauth-serving-cert\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.367085 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.366982 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1bf2012b-ff7f-450c-beca-96f972eb0894-console-oauth-config\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.367085 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.367011 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n6dm9\" (UniqueName: \"kubernetes.io/projected/1bf2012b-ff7f-450c-beca-96f972eb0894-kube-api-access-n6dm9\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.367420 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.367083 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-console-config\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.367547 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.367515 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-service-ca\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.367669 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.367632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-trusted-ca-bundle\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.367725 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.367691 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-console-config\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.367815 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.367788 
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1bf2012b-ff7f-450c-beca-96f972eb0894-oauth-serving-cert\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.369317 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.369295 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf2012b-ff7f-450c-beca-96f972eb0894-console-serving-cert\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.369408 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.369317 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1bf2012b-ff7f-450c-beca-96f972eb0894-console-oauth-config\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.376238 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.376217 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6dm9\" (UniqueName: \"kubernetes.io/projected/1bf2012b-ff7f-450c-beca-96f972eb0894-kube-api-access-n6dm9\") pod \"console-844ff484b4-jvpzl\" (UID: \"1bf2012b-ff7f-450c-beca-96f972eb0894\") " pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.473885 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.473856 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:44.594913 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.594890 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-844ff484b4-jvpzl"] Apr 20 21:56:44.596265 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:56:44.596238 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bf2012b_ff7f_450c_beca_96f972eb0894.slice/crio-af92a4e988466d802fbc46cab2e03bbc9363356bc415a679d02f634bbd1af696 WatchSource:0}: Error finding container af92a4e988466d802fbc46cab2e03bbc9363356bc415a679d02f634bbd1af696: Status 404 returned error can't find the container with id af92a4e988466d802fbc46cab2e03bbc9363356bc415a679d02f634bbd1af696 Apr 20 21:56:44.681534 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.681495 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844ff484b4-jvpzl" event={"ID":"1bf2012b-ff7f-450c-beca-96f972eb0894","Type":"ContainerStarted","Data":"87cd42fb94cfcd992640b89d6118a534780298348e5851b7e38700002e577a58"} Apr 20 21:56:44.681534 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.681537 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844ff484b4-jvpzl" event={"ID":"1bf2012b-ff7f-450c-beca-96f972eb0894","Type":"ContainerStarted","Data":"af92a4e988466d802fbc46cab2e03bbc9363356bc415a679d02f634bbd1af696"} Apr 20 21:56:44.699215 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:44.699145 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-844ff484b4-jvpzl" podStartSLOduration=0.699125411 podStartE2EDuration="699.125411ms" podCreationTimestamp="2026-04-20 21:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:56:44.698650008 +0000 UTC 
m=+573.507337373" watchObservedRunningTime="2026-04-20 21:56:44.699125411 +0000 UTC m=+573.507812782" Apr 20 21:56:51.555345 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.555304 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-8dnwh"] Apr 20 21:56:51.559393 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.559369 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" Apr 20 21:56:51.562242 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.562219 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-bnfrd\"" Apr 20 21:56:51.562935 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.562918 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 21:56:51.563583 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.563565 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 21:56:51.570424 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.570401 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-8dnwh"] Apr 20 21:56:51.625513 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.625483 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phb77\" (UniqueName: \"kubernetes.io/projected/2d22820f-e38b-4777-bd21-da858275f747-kube-api-access-phb77\") pod \"authorino-operator-657f44b778-8dnwh\" (UID: \"2d22820f-e38b-4777-bd21-da858275f747\") " pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" Apr 20 21:56:51.726208 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.726169 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phb77\" 
(UniqueName: \"kubernetes.io/projected/2d22820f-e38b-4777-bd21-da858275f747-kube-api-access-phb77\") pod \"authorino-operator-657f44b778-8dnwh\" (UID: \"2d22820f-e38b-4777-bd21-da858275f747\") " pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" Apr 20 21:56:51.737433 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.737408 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phb77\" (UniqueName: \"kubernetes.io/projected/2d22820f-e38b-4777-bd21-da858275f747-kube-api-access-phb77\") pod \"authorino-operator-657f44b778-8dnwh\" (UID: \"2d22820f-e38b-4777-bd21-da858275f747\") " pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" Apr 20 21:56:51.871235 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.871168 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" Apr 20 21:56:51.993926 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:51.993903 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-8dnwh"] Apr 20 21:56:51.995947 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:56:51.995920 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d22820f_e38b_4777_bd21_da858275f747.slice/crio-04440e6d8c9ab9ac174f2946ecbbe85650d7ffc57f89f2a6d003cf9541e6324c WatchSource:0}: Error finding container 04440e6d8c9ab9ac174f2946ecbbe85650d7ffc57f89f2a6d003cf9541e6324c: Status 404 returned error can't find the container with id 04440e6d8c9ab9ac174f2946ecbbe85650d7ffc57f89f2a6d003cf9541e6324c Apr 20 21:56:52.716552 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:52.716518 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" 
event={"ID":"2d22820f-e38b-4777-bd21-da858275f747","Type":"ContainerStarted","Data":"04440e6d8c9ab9ac174f2946ecbbe85650d7ffc57f89f2a6d003cf9541e6324c"} Apr 20 21:56:54.474759 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:54.474726 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:54.475172 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:54.474788 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:54.479542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:54.479517 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:54.724900 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:54.724812 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" event={"ID":"2d22820f-e38b-4777-bd21-da858275f747","Type":"ContainerStarted","Data":"b16818baf71aa469543976f271e063c26cca60a7da50c76fe7672a9fa5065e28"} Apr 20 21:56:54.725041 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:54.724936 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" Apr 20 21:56:54.728878 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:54.728860 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-844ff484b4-jvpzl" Apr 20 21:56:54.743536 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:54.743493 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" podStartSLOduration=1.979336402 podStartE2EDuration="3.743483195s" podCreationTimestamp="2026-04-20 21:56:51 +0000 UTC" firstStartedPulling="2026-04-20 21:56:51.998045023 +0000 UTC m=+580.806732366" lastFinishedPulling="2026-04-20 
21:56:53.762191816 +0000 UTC m=+582.570879159" observedRunningTime="2026-04-20 21:56:54.740547971 +0000 UTC m=+583.549235337" watchObservedRunningTime="2026-04-20 21:56:54.743483195 +0000 UTC m=+583.552170595" Apr 20 21:56:54.785401 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:54.785373 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74f4fd8fbc-m7zxp"] Apr 20 21:56:56.224650 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.224621 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv"] Apr 20 21:56:56.228184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.228167 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" Apr 20 21:56:56.230843 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.230818 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 21:56:56.230843 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.230818 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-htfgh\"" Apr 20 21:56:56.238208 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.238187 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv"] Apr 20 21:56:56.365600 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.365572 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jgps\" (UniqueName: \"kubernetes.io/projected/699ac8a1-5fe5-49ad-acb8-67aaae92caea-kube-api-access-4jgps\") pod \"dns-operator-controller-manager-648d5c98bc-zlgxv\" (UID: \"699ac8a1-5fe5-49ad-acb8-67aaae92caea\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" Apr 20 21:56:56.466921 
ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.466891 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jgps\" (UniqueName: \"kubernetes.io/projected/699ac8a1-5fe5-49ad-acb8-67aaae92caea-kube-api-access-4jgps\") pod \"dns-operator-controller-manager-648d5c98bc-zlgxv\" (UID: \"699ac8a1-5fe5-49ad-acb8-67aaae92caea\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" Apr 20 21:56:56.475060 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.475008 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jgps\" (UniqueName: \"kubernetes.io/projected/699ac8a1-5fe5-49ad-acb8-67aaae92caea-kube-api-access-4jgps\") pod \"dns-operator-controller-manager-648d5c98bc-zlgxv\" (UID: \"699ac8a1-5fe5-49ad-acb8-67aaae92caea\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" Apr 20 21:56:56.538847 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.538812 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" Apr 20 21:56:56.679132 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.679103 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv"] Apr 20 21:56:56.680513 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:56:56.680487 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699ac8a1_5fe5_49ad_acb8_67aaae92caea.slice/crio-e52514fb400b14b86d9c88db1fee1009f830aa80bc33af39808be85c897a4d8c WatchSource:0}: Error finding container e52514fb400b14b86d9c88db1fee1009f830aa80bc33af39808be85c897a4d8c: Status 404 returned error can't find the container with id e52514fb400b14b86d9c88db1fee1009f830aa80bc33af39808be85c897a4d8c Apr 20 21:56:56.741993 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:56.741916 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" event={"ID":"699ac8a1-5fe5-49ad-acb8-67aaae92caea","Type":"ContainerStarted","Data":"e52514fb400b14b86d9c88db1fee1009f830aa80bc33af39808be85c897a4d8c"} Apr 20 21:56:59.760418 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:59.760381 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" event={"ID":"699ac8a1-5fe5-49ad-acb8-67aaae92caea","Type":"ContainerStarted","Data":"0234356edc513622bf81ced48dba1e4ad9095721f72fe8e8c47b8d63f0d07b4e"} Apr 20 21:56:59.760780 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:59.760509 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" Apr 20 21:56:59.779683 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:56:59.779630 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" podStartSLOduration=1.305396475 podStartE2EDuration="3.779612795s" podCreationTimestamp="2026-04-20 21:56:56 +0000 UTC" firstStartedPulling="2026-04-20 21:56:56.682783666 +0000 UTC m=+585.491471009" lastFinishedPulling="2026-04-20 21:56:59.156999976 +0000 UTC m=+587.965687329" observedRunningTime="2026-04-20 21:56:59.776223555 +0000 UTC m=+588.584910920" watchObservedRunningTime="2026-04-20 21:56:59.779612795 +0000 UTC m=+588.588300161" Apr 20 21:57:05.733964 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:05.733935 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-8dnwh" Apr 20 21:57:07.700325 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.700275 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj"] Apr 20 21:57:07.703868 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.703845 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:07.707874 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.707856 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-htr2p\"" Apr 20 21:57:07.717967 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.717945 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj"] Apr 20 21:57:07.864493 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.864464 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldvg6\" (UniqueName: \"kubernetes.io/projected/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-kube-api-access-ldvg6\") pod \"kuadrant-operator-controller-manager-55c7f4c975-q5ftj\" (UID: \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:07.864660 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.864515 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-q5ftj\" (UID: \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:07.965939 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.965913 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-q5ftj\" (UID: \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:07.966125 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.966001 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvg6\" (UniqueName: \"kubernetes.io/projected/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-kube-api-access-ldvg6\") pod \"kuadrant-operator-controller-manager-55c7f4c975-q5ftj\" (UID: \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:07.966359 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.966336 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-q5ftj\" (UID: \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:07.982898 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:07.982876 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvg6\" (UniqueName: \"kubernetes.io/projected/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-kube-api-access-ldvg6\") pod \"kuadrant-operator-controller-manager-55c7f4c975-q5ftj\" (UID: \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:08.014973 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:08.014952 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:08.139638 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:08.139611 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj"] Apr 20 21:57:08.142236 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:57:08.142206 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08ad341_1cc2_40b8_a166_d96dbe06bcc6.slice/crio-f549338eb955f528a495fd4359cb74618b95e12e2170a6438c253d19a65827a5 WatchSource:0}: Error finding container f549338eb955f528a495fd4359cb74618b95e12e2170a6438c253d19a65827a5: Status 404 returned error can't find the container with id f549338eb955f528a495fd4359cb74618b95e12e2170a6438c253d19a65827a5 Apr 20 21:57:08.805177 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:08.805135 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" event={"ID":"a08ad341-1cc2-40b8-a166-d96dbe06bcc6","Type":"ContainerStarted","Data":"f549338eb955f528a495fd4359cb74618b95e12e2170a6438c253d19a65827a5"} Apr 20 21:57:10.766835 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:10.766801 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zlgxv" Apr 20 21:57:11.806007 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:11.805981 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 21:57:11.806406 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:11.805982 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 21:57:12.822562 
ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:12.822524 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" event={"ID":"a08ad341-1cc2-40b8-a166-d96dbe06bcc6","Type":"ContainerStarted","Data":"04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f"} Apr 20 21:57:12.823014 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:12.822642 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 21:57:12.844725 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:12.844678 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" podStartSLOduration=2.152106967 podStartE2EDuration="5.844665586s" podCreationTimestamp="2026-04-20 21:57:07 +0000 UTC" firstStartedPulling="2026-04-20 21:57:08.144385977 +0000 UTC m=+596.953073320" lastFinishedPulling="2026-04-20 21:57:11.836944594 +0000 UTC m=+600.645631939" observedRunningTime="2026-04-20 21:57:12.840965183 +0000 UTC m=+601.649652547" watchObservedRunningTime="2026-04-20 21:57:12.844665586 +0000 UTC m=+601.653352955" Apr 20 21:57:19.809711 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:19.809642 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74f4fd8fbc-m7zxp" podUID="abd53352-97ee-4fd9-86bf-f96dd62d3a92" containerName="console" containerID="cri-o://9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723" gracePeriod=15 Apr 20 21:57:20.047473 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.047448 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74f4fd8fbc-m7zxp_abd53352-97ee-4fd9-86bf-f96dd62d3a92/console/0.log" Apr 20 21:57:20.047599 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.047518 2566 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-console/console-74f4fd8fbc-m7zxp"
Apr 20 21:57:20.170186 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170105 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-service-ca\") pod \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") "
Apr 20 21:57:20.170186 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170143 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-serving-cert\") pod \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") "
Apr 20 21:57:20.170186 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170171 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-oauth-serving-cert\") pod \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") "
Apr 20 21:57:20.170501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170213 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-config\") pod \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") "
Apr 20 21:57:20.170501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170232 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf9ht\" (UniqueName: \"kubernetes.io/projected/abd53352-97ee-4fd9-86bf-f96dd62d3a92-kube-api-access-kf9ht\") pod \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") "
Apr 20 21:57:20.170501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170275 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-trusted-ca-bundle\") pod \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") "
Apr 20 21:57:20.170501 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170321 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-oauth-config\") pod \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\" (UID: \"abd53352-97ee-4fd9-86bf-f96dd62d3a92\") "
Apr 20 21:57:20.170707 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170647 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-service-ca" (OuterVolumeSpecName: "service-ca") pod "abd53352-97ee-4fd9-86bf-f96dd62d3a92" (UID: "abd53352-97ee-4fd9-86bf-f96dd62d3a92"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:57:20.170758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170695 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-config" (OuterVolumeSpecName: "console-config") pod "abd53352-97ee-4fd9-86bf-f96dd62d3a92" (UID: "abd53352-97ee-4fd9-86bf-f96dd62d3a92"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:57:20.170758 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170702 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "abd53352-97ee-4fd9-86bf-f96dd62d3a92" (UID: "abd53352-97ee-4fd9-86bf-f96dd62d3a92"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:57:20.170847 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.170783 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "abd53352-97ee-4fd9-86bf-f96dd62d3a92" (UID: "abd53352-97ee-4fd9-86bf-f96dd62d3a92"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:57:20.172615 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.172591 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "abd53352-97ee-4fd9-86bf-f96dd62d3a92" (UID: "abd53352-97ee-4fd9-86bf-f96dd62d3a92"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:57:20.172755 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.172717 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "abd53352-97ee-4fd9-86bf-f96dd62d3a92" (UID: "abd53352-97ee-4fd9-86bf-f96dd62d3a92"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:57:20.172848 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.172830 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd53352-97ee-4fd9-86bf-f96dd62d3a92-kube-api-access-kf9ht" (OuterVolumeSpecName: "kube-api-access-kf9ht") pod "abd53352-97ee-4fd9-86bf-f96dd62d3a92" (UID: "abd53352-97ee-4fd9-86bf-f96dd62d3a92"). InnerVolumeSpecName "kube-api-access-kf9ht". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:57:20.271025 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.270997 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:57:20.271025 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.271021 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kf9ht\" (UniqueName: \"kubernetes.io/projected/abd53352-97ee-4fd9-86bf-f96dd62d3a92-kube-api-access-kf9ht\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:57:20.271025 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.271030 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-trusted-ca-bundle\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:57:20.271236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.271040 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-oauth-config\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:57:20.271236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.271048 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-service-ca\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:57:20.271236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.271056 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abd53352-97ee-4fd9-86bf-f96dd62d3a92-console-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:57:20.271236 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.271065 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abd53352-97ee-4fd9-86bf-f96dd62d3a92-oauth-serving-cert\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\""
Apr 20 21:57:20.853343 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.853318 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74f4fd8fbc-m7zxp_abd53352-97ee-4fd9-86bf-f96dd62d3a92/console/0.log"
Apr 20 21:57:20.853781 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.853355 2566 generic.go:358] "Generic (PLEG): container finished" podID="abd53352-97ee-4fd9-86bf-f96dd62d3a92" containerID="9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723" exitCode=2
Apr 20 21:57:20.853781 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.853387 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74f4fd8fbc-m7zxp" event={"ID":"abd53352-97ee-4fd9-86bf-f96dd62d3a92","Type":"ContainerDied","Data":"9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723"}
Apr 20 21:57:20.853781 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.853416 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74f4fd8fbc-m7zxp"
Apr 20 21:57:20.853781 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.853435 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74f4fd8fbc-m7zxp" event={"ID":"abd53352-97ee-4fd9-86bf-f96dd62d3a92","Type":"ContainerDied","Data":"58d17e5c5da77054293aafef2138b6746d9a74a58e23a8164a87a9ebca2bfa84"}
Apr 20 21:57:20.853781 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.853457 2566 scope.go:117] "RemoveContainer" containerID="9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723"
Apr 20 21:57:20.862896 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.862879 2566 scope.go:117] "RemoveContainer" containerID="9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723"
Apr 20 21:57:20.863135 ip-10-0-137-199 kubenswrapper[2566]: E0420 21:57:20.863116 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723\": container with ID starting with 9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723 not found: ID does not exist" containerID="9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723"
Apr 20 21:57:20.863179 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.863145 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723"} err="failed to get container status \"9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723\": rpc error: code = NotFound desc = could not find container \"9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723\": container with ID starting with 9591c4e4532a9965ce9cdb5b87b01d8420a486e98ab155d084d2de7fc8187723 not found: ID does not exist"
Apr 20 21:57:20.875459 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.875435 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74f4fd8fbc-m7zxp"]
Apr 20 21:57:20.880184 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:20.880166 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74f4fd8fbc-m7zxp"]
Apr 20 21:57:21.774648 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:21.774609 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abd53352-97ee-4fd9-86bf-f96dd62d3a92" path="/var/lib/kubelet/pods/abd53352-97ee-4fd9-86bf-f96dd62d3a92/volumes"
Apr 20 21:57:23.829419 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:57:23.829390 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj"
Apr 20 21:58:05.367251 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.367218 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-4rvxn"]
Apr 20 21:58:05.367836 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.367634 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abd53352-97ee-4fd9-86bf-f96dd62d3a92" containerName="console"
Apr 20 21:58:05.367836 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.367646 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd53352-97ee-4fd9-86bf-f96dd62d3a92" containerName="console"
Apr 20 21:58:05.367836 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.367715 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="abd53352-97ee-4fd9-86bf-f96dd62d3a92" containerName="console"
Apr 20 21:58:05.371761 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.371730 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:05.382748 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.381769 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-fs8hd\""
Apr 20 21:58:05.382748 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.382215 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 20 21:58:05.389665 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.389639 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-4rvxn"]
Apr 20 21:58:05.441476 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.441446 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2f5e5179-dc44-4e9b-8e10-6649652bec36-data\") pod \"postgres-868db5846d-4rvxn\" (UID: \"2f5e5179-dc44-4e9b-8e10-6649652bec36\") " pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:05.441476 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.441481 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9gb5\" (UniqueName: \"kubernetes.io/projected/2f5e5179-dc44-4e9b-8e10-6649652bec36-kube-api-access-h9gb5\") pod \"postgres-868db5846d-4rvxn\" (UID: \"2f5e5179-dc44-4e9b-8e10-6649652bec36\") " pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:05.542906 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.542870 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2f5e5179-dc44-4e9b-8e10-6649652bec36-data\") pod \"postgres-868db5846d-4rvxn\" (UID: \"2f5e5179-dc44-4e9b-8e10-6649652bec36\") " pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:05.542906 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.542906 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9gb5\" (UniqueName: \"kubernetes.io/projected/2f5e5179-dc44-4e9b-8e10-6649652bec36-kube-api-access-h9gb5\") pod \"postgres-868db5846d-4rvxn\" (UID: \"2f5e5179-dc44-4e9b-8e10-6649652bec36\") " pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:05.543266 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.543247 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2f5e5179-dc44-4e9b-8e10-6649652bec36-data\") pod \"postgres-868db5846d-4rvxn\" (UID: \"2f5e5179-dc44-4e9b-8e10-6649652bec36\") " pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:05.550441 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.550422 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9gb5\" (UniqueName: \"kubernetes.io/projected/2f5e5179-dc44-4e9b-8e10-6649652bec36-kube-api-access-h9gb5\") pod \"postgres-868db5846d-4rvxn\" (UID: \"2f5e5179-dc44-4e9b-8e10-6649652bec36\") " pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:05.695753 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.695682 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:05.814107 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.814079 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-4rvxn"]
Apr 20 21:58:05.816157 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:58:05.816130 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f5e5179_dc44_4e9b_8e10_6649652bec36.slice/crio-eb21b8bf648f1a83e13ffa831a20d18e489ee9e26597e3363905d562a4ac4074 WatchSource:0}: Error finding container eb21b8bf648f1a83e13ffa831a20d18e489ee9e26597e3363905d562a4ac4074: Status 404 returned error can't find the container with id eb21b8bf648f1a83e13ffa831a20d18e489ee9e26597e3363905d562a4ac4074
Apr 20 21:58:05.817372 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:05.817354 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:58:06.041036 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:06.041003 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-4rvxn" event={"ID":"2f5e5179-dc44-4e9b-8e10-6649652bec36","Type":"ContainerStarted","Data":"eb21b8bf648f1a83e13ffa831a20d18e489ee9e26597e3363905d562a4ac4074"}
Apr 20 21:58:11.066440 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:11.066406 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-4rvxn" event={"ID":"2f5e5179-dc44-4e9b-8e10-6649652bec36","Type":"ContainerStarted","Data":"c9ea2f1c935c4a9e8f5642df52f43a482d7c3ca2ad060306a22ca7f7a747d641"}
Apr 20 21:58:11.066934 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:11.066535 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:58:11.085225 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:11.085184 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-4rvxn" podStartSLOduration=1.223328578 podStartE2EDuration="6.085173492s" podCreationTimestamp="2026-04-20 21:58:05 +0000 UTC" firstStartedPulling="2026-04-20 21:58:05.817476555 +0000 UTC m=+654.626163899" lastFinishedPulling="2026-04-20 21:58:10.679321469 +0000 UTC m=+659.488008813" observedRunningTime="2026-04-20 21:58:11.083094011 +0000 UTC m=+659.891781375" watchObservedRunningTime="2026-04-20 21:58:11.085173492 +0000 UTC m=+659.893860851"
Apr 20 21:58:17.098264 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:58:17.098235 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-4rvxn"
Apr 20 21:59:16.959375 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:16.959336 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"]
Apr 20 21:59:16.961995 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:16.961979 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:16.964692 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:16.964666 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-wxx6x\""
Apr 20 21:59:16.964809 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:16.964710 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 21:59:16.965971 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:16.965953 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 21:59:16.966024 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:16.965976 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 20 21:59:16.973564 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:16.973544 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"]
Apr 20 21:59:17.132128 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.132093 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.132128 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.132126 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.132416 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.132156 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.132416 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.132324 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.132416 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.132374 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9qw\" (UniqueName: \"kubernetes.io/projected/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-kube-api-access-7b9qw\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.132535 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.132427 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233276 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233168 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233276 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233218 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233335 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233542 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233435 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233663 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233595 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233663 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233647 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9qw\" (UniqueName: \"kubernetes.io/projected/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-kube-api-access-7b9qw\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233751 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233703 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233751 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233724 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.233952 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.233929 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.236194 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.236166 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.236401 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.236383 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.241457 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.241434 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9qw\" (UniqueName: \"kubernetes.io/projected/7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d-kube-api-access-7b9qw\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c\" (UID: \"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.273163 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.273138 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:17.613218 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:17.613166 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"]
Apr 20 21:59:17.614677 ip-10-0-137-199 kubenswrapper[2566]: W0420 21:59:17.614638 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f4f5d41_9a15_453e_8a7c_0f5ebdc6543d.slice/crio-64d91b394172fa53ebeccde78753f19082e738c5f68bc013b0d07351d5978f52 WatchSource:0}: Error finding container 64d91b394172fa53ebeccde78753f19082e738c5f68bc013b0d07351d5978f52: Status 404 returned error can't find the container with id 64d91b394172fa53ebeccde78753f19082e738c5f68bc013b0d07351d5978f52
Apr 20 21:59:18.333034 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:18.332992 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c" event={"ID":"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d","Type":"ContainerStarted","Data":"64d91b394172fa53ebeccde78753f19082e738c5f68bc013b0d07351d5978f52"}
Apr 20 21:59:23.362991 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:23.362951 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c" event={"ID":"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d","Type":"ContainerStarted","Data":"2a328c83bdb6af3e527bbd91974b2488fc7d1ed35c2c5a34593397ce0dcb701b"}
Apr 20 21:59:29.388812 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:29.388778 2566 generic.go:358] "Generic (PLEG): container finished" podID="7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d" containerID="2a328c83bdb6af3e527bbd91974b2488fc7d1ed35c2c5a34593397ce0dcb701b" exitCode=0
Apr 20 21:59:29.389306 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:29.388852 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c" event={"ID":"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d","Type":"ContainerDied","Data":"2a328c83bdb6af3e527bbd91974b2488fc7d1ed35c2c5a34593397ce0dcb701b"}
Apr 20 21:59:31.397856 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:31.397823 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c" event={"ID":"7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d","Type":"ContainerStarted","Data":"f2406de47881826b83b6265b93aa11f5de514252228aed28c73b63213a6eb3c7"}
Apr 20 21:59:31.398253 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:31.398038 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 21:59:31.417581 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:31.417533 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c" podStartSLOduration=2.556199071 podStartE2EDuration="15.417504133s" podCreationTimestamp="2026-04-20 21:59:16 +0000 UTC" firstStartedPulling="2026-04-20 21:59:17.617033303 +0000 UTC m=+726.425720649" lastFinishedPulling="2026-04-20 21:59:30.478338365 +0000 UTC m=+739.287025711" observedRunningTime="2026-04-20 21:59:31.414978104 +0000 UTC m=+740.223665471" watchObservedRunningTime="2026-04-20 21:59:31.417504133 +0000 UTC m=+740.226191498"
Apr 20 21:59:42.426550 ip-10-0-137-199 kubenswrapper[2566]: I0420 21:59:42.426520 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c"
Apr 20 22:00:00.133664 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.133632 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29612040-hhdg9"]
Apr 20 22:00:00.167957 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.167919 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612040-hhdg9"]
Apr 20 22:00:00.168106 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.168069 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9"
Apr 20 22:00:00.170607 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.170590 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-2zcsj\""
Apr 20 22:00:00.323029 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.322999 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4rs\" (UniqueName: \"kubernetes.io/projected/3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2-kube-api-access-9c4rs\") pod \"maas-api-key-cleanup-29612040-hhdg9\" (UID: \"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2\") " pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9"
Apr 20 22:00:00.423559 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.423480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4rs\" (UniqueName: \"kubernetes.io/projected/3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2-kube-api-access-9c4rs\") pod \"maas-api-key-cleanup-29612040-hhdg9\" (UID: \"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2\") " pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9"
Apr 20 22:00:00.433725 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.433698 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4rs\" (UniqueName: \"kubernetes.io/projected/3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2-kube-api-access-9c4rs\") pod \"maas-api-key-cleanup-29612040-hhdg9\" (UID: \"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2\") " pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9"
Apr 20 22:00:00.478166 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.478123 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9"
Apr 20 22:00:00.616508 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:00.616436 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612040-hhdg9"]
Apr 20 22:00:00.626581 ip-10-0-137-199 kubenswrapper[2566]: W0420 22:00:00.626541 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d7b26a9_ec0d_43f9_bbda_dd45ea6ac3a2.slice/crio-e0904c22dc87096ed0dc430b5f3a2a03b11f7e2b0a20ed39046116d3fb4be975 WatchSource:0}: Error finding container e0904c22dc87096ed0dc430b5f3a2a03b11f7e2b0a20ed39046116d3fb4be975: Status 404 returned error can't find the container with id e0904c22dc87096ed0dc430b5f3a2a03b11f7e2b0a20ed39046116d3fb4be975
Apr 20 22:00:01.535606 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:01.535565 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" event={"ID":"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2","Type":"ContainerStarted","Data":"e0904c22dc87096ed0dc430b5f3a2a03b11f7e2b0a20ed39046116d3fb4be975"}
Apr 20 22:00:04.554480 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:04.554439 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" event={"ID":"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2","Type":"ContainerStarted","Data":"8e50956826204d1c5ecd63da0fa85606dd9708e2666ee655153cc1dbafc5aede"}
Apr 20 22:00:04.571308 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:04.571246 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" podStartSLOduration=1.545481771 podStartE2EDuration="4.571231254s" podCreationTimestamp="2026-04-20 22:00:00 +0000 UTC" firstStartedPulling="2026-04-20 22:00:00.629294852 +0000 UTC m=+769.437982209" lastFinishedPulling="2026-04-20 22:00:03.655044346 +0000 UTC m=+772.463731692" observedRunningTime="2026-04-20 22:00:04.56838087 +0000 UTC m=+773.377068235" watchObservedRunningTime="2026-04-20 22:00:04.571231254 +0000 UTC m=+773.379918619"
Apr 20 22:00:13.263855 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.263817 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"]
Apr 20 22:00:13.268434 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.268401 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"
Apr 20 22:00:13.272498 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.272475 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 20 22:00:13.275832 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.275811 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"]
Apr 20 22:00:13.341265 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.341221 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wk6h\" (UniqueName: \"kubernetes.io/projected/069ce815-d122-4308-a1af-5f1e8c6deea4-kube-api-access-4wk6h\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"
Apr 20 22:00:13.341442 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.341275 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"
Apr 20 22:00:13.341442 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.341330 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"
Apr 20 22:00:13.341442 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.341373 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/069ce815-d122-4308-a1af-5f1e8c6deea4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"
Apr 20 22:00:13.341619 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.341461 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"
Apr 20 22:00:13.341619 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.341500 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"
Apr 20 22:00:13.442169 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442134 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.442373 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442190 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.442373 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442238 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wk6h\" (UniqueName: \"kubernetes.io/projected/069ce815-d122-4308-a1af-5f1e8c6deea4-kube-api-access-4wk6h\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.442373 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442268 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.442373 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442323 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.442373 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442359 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/069ce815-d122-4308-a1af-5f1e8c6deea4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.442661 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442639 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.442729 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442657 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.442787 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.442756 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.445330 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.445275 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/069ce815-d122-4308-a1af-5f1e8c6deea4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.445460 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.445441 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/069ce815-d122-4308-a1af-5f1e8c6deea4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.449930 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.449911 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wk6h\" (UniqueName: \"kubernetes.io/projected/069ce815-d122-4308-a1af-5f1e8c6deea4-kube-api-access-4wk6h\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4\" (UID: \"069ce815-d122-4308-a1af-5f1e8c6deea4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.580180 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.580099 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:13.720448 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:13.720409 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4"] Apr 20 22:00:13.724656 ip-10-0-137-199 kubenswrapper[2566]: W0420 22:00:13.724622 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod069ce815_d122_4308_a1af_5f1e8c6deea4.slice/crio-19f6ed0a22ead83afced278865d1f365a1846b090400b035c70e1d618ca294b5 WatchSource:0}: Error finding container 19f6ed0a22ead83afced278865d1f365a1846b090400b035c70e1d618ca294b5: Status 404 returned error can't find the container with id 19f6ed0a22ead83afced278865d1f365a1846b090400b035c70e1d618ca294b5 Apr 20 22:00:14.601843 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:14.601807 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" event={"ID":"069ce815-d122-4308-a1af-5f1e8c6deea4","Type":"ContainerStarted","Data":"b68fdfd7da7e621a59ef5b5bfbfc95c08cc8813bb95aa79e594bb77ad19a5082"} Apr 20 22:00:14.601843 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:14.601850 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" event={"ID":"069ce815-d122-4308-a1af-5f1e8c6deea4","Type":"ContainerStarted","Data":"19f6ed0a22ead83afced278865d1f365a1846b090400b035c70e1d618ca294b5"} Apr 20 22:00:19.625579 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:19.625543 2566 generic.go:358] "Generic (PLEG): container finished" podID="069ce815-d122-4308-a1af-5f1e8c6deea4" containerID="b68fdfd7da7e621a59ef5b5bfbfc95c08cc8813bb95aa79e594bb77ad19a5082" exitCode=0 Apr 20 22:00:19.626003 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:19.625619 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" event={"ID":"069ce815-d122-4308-a1af-5f1e8c6deea4","Type":"ContainerDied","Data":"b68fdfd7da7e621a59ef5b5bfbfc95c08cc8813bb95aa79e594bb77ad19a5082"} Apr 20 22:00:24.649724 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:24.649634 2566 generic.go:358] "Generic (PLEG): container finished" podID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerID="8e50956826204d1c5ecd63da0fa85606dd9708e2666ee655153cc1dbafc5aede" exitCode=6 Apr 20 22:00:24.649724 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:24.649690 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" event={"ID":"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2","Type":"ContainerDied","Data":"8e50956826204d1c5ecd63da0fa85606dd9708e2666ee655153cc1dbafc5aede"} Apr 20 22:00:24.650117 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:24.649982 2566 scope.go:117] "RemoveContainer" containerID="8e50956826204d1c5ecd63da0fa85606dd9708e2666ee655153cc1dbafc5aede" Apr 20 22:00:25.655708 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:25.655677 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" event={"ID":"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2","Type":"ContainerStarted","Data":"286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1"} Apr 20 22:00:32.686796 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:32.686707 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" event={"ID":"069ce815-d122-4308-a1af-5f1e8c6deea4","Type":"ContainerStarted","Data":"73fe41de58c204b142a3574e23faf4e4a52c0f6517926f2ff78b452efc83fdbc"} Apr 20 22:00:32.687241 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:32.686937 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:32.706517 ip-10-0-137-199 kubenswrapper[2566]: I0420 
22:00:32.706467 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" podStartSLOduration=6.9793868759999995 podStartE2EDuration="19.706454306s" podCreationTimestamp="2026-04-20 22:00:13 +0000 UTC" firstStartedPulling="2026-04-20 22:00:19.626193899 +0000 UTC m=+788.434881242" lastFinishedPulling="2026-04-20 22:00:32.353261324 +0000 UTC m=+801.161948672" observedRunningTime="2026-04-20 22:00:32.705048318 +0000 UTC m=+801.513735696" watchObservedRunningTime="2026-04-20 22:00:32.706454306 +0000 UTC m=+801.515141700" Apr 20 22:00:43.703847 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:43.703816 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4" Apr 20 22:00:45.741795 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:45.741767 2566 generic.go:358] "Generic (PLEG): container finished" podID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerID="286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1" exitCode=6 Apr 20 22:00:45.742216 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:45.741841 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" event={"ID":"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2","Type":"ContainerDied","Data":"286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1"} Apr 20 22:00:45.742216 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:45.741882 2566 scope.go:117] "RemoveContainer" containerID="8e50956826204d1c5ecd63da0fa85606dd9708e2666ee655153cc1dbafc5aede" Apr 20 22:00:45.742397 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:45.742240 2566 scope.go:117] "RemoveContainer" containerID="286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1" Apr 20 22:00:45.742529 ip-10-0-137-199 kubenswrapper[2566]: E0420 22:00:45.742508 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29612040-hhdg9_opendatahub(3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2)\"" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" Apr 20 22:00:57.769935 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:57.769906 2566 scope.go:117] "RemoveContainer" containerID="286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1" Apr 20 22:00:58.797332 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:58.797274 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" event={"ID":"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2","Type":"ContainerStarted","Data":"dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae"} Apr 20 22:00:59.826508 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:59.826476 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612040-hhdg9"] Apr 20 22:00:59.826879 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:00:59.826658 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" containerID="cri-o://dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae" gracePeriod=30 Apr 20 22:01:18.465514 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.465493 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" Apr 20 22:01:18.521245 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.521215 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c4rs\" (UniqueName: \"kubernetes.io/projected/3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2-kube-api-access-9c4rs\") pod \"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2\" (UID: \"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2\") " Apr 20 22:01:18.523486 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.523456 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2-kube-api-access-9c4rs" (OuterVolumeSpecName: "kube-api-access-9c4rs") pod "3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" (UID: "3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2"). InnerVolumeSpecName "kube-api-access-9c4rs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:01:18.622158 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.622089 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9c4rs\" (UniqueName: \"kubernetes.io/projected/3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2-kube-api-access-9c4rs\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 22:01:18.878860 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.878786 2566 generic.go:358] "Generic (PLEG): container finished" podID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerID="dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae" exitCode=6 Apr 20 22:01:18.878860 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.878854 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" Apr 20 22:01:18.879027 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.878848 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" event={"ID":"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2","Type":"ContainerDied","Data":"dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae"} Apr 20 22:01:18.879027 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.878957 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612040-hhdg9" event={"ID":"3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2","Type":"ContainerDied","Data":"e0904c22dc87096ed0dc430b5f3a2a03b11f7e2b0a20ed39046116d3fb4be975"} Apr 20 22:01:18.879027 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.878972 2566 scope.go:117] "RemoveContainer" containerID="dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae" Apr 20 22:01:18.889930 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.889911 2566 scope.go:117] "RemoveContainer" containerID="286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1" Apr 20 22:01:18.897676 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.897660 2566 scope.go:117] "RemoveContainer" containerID="dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae" Apr 20 22:01:18.897926 ip-10-0-137-199 kubenswrapper[2566]: E0420 22:01:18.897909 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae\": container with ID starting with dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae not found: ID does not exist" containerID="dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae" Apr 20 22:01:18.897974 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.897936 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae"} err="failed to get container status \"dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae\": rpc error: code = NotFound desc = could not find container \"dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae\": container with ID starting with dbdcce47375d447e1857d2f2124dda4b5bddf12168d963b09e13b0d9037b7bae not found: ID does not exist" Apr 20 22:01:18.897974 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.897953 2566 scope.go:117] "RemoveContainer" containerID="286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1" Apr 20 22:01:18.898190 ip-10-0-137-199 kubenswrapper[2566]: E0420 22:01:18.898174 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1\": container with ID starting with 286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1 not found: ID does not exist" containerID="286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1" Apr 20 22:01:18.898235 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.898197 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1"} err="failed to get container status \"286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1\": rpc error: code = NotFound desc = could not find container \"286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1\": container with ID starting with 286287fc0c8fb954e845c7623c94e793a7c63d7cced5719af68dad2acf063ea1 not found: ID does not exist" Apr 20 22:01:18.903873 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:18.903854 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612040-hhdg9"] Apr 20 22:01:18.907382 ip-10-0-137-199 
kubenswrapper[2566]: I0420 22:01:18.907363 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612040-hhdg9"] Apr 20 22:01:19.774498 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:01:19.774472 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" path="/var/lib/kubelet/pods/3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2/volumes" Apr 20 22:02:11.843707 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:02:11.843677 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:02:11.845030 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:02:11.845009 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:07:11.879350 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:07:11.879259 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:07:11.883074 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:07:11.883055 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:12:11.913203 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:11.913175 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:12:11.920570 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:11.920549 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:12:25.768265 ip-10-0-137-199 kubenswrapper[2566]: I0420 
22:12:25.768231 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj"] Apr 20 22:12:25.768796 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:25.768465 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" podUID="a08ad341-1cc2-40b8-a166-d96dbe06bcc6" containerName="manager" containerID="cri-o://04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f" gracePeriod=10 Apr 20 22:12:26.015315 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.015268 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 22:12:26.069794 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.069719 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-extensions-socket-volume\") pod \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\" (UID: \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\") " Apr 20 22:12:26.069915 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.069838 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldvg6\" (UniqueName: \"kubernetes.io/projected/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-kube-api-access-ldvg6\") pod \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\" (UID: \"a08ad341-1cc2-40b8-a166-d96dbe06bcc6\") " Apr 20 22:12:26.070093 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.070071 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "a08ad341-1cc2-40b8-a166-d96dbe06bcc6" (UID: "a08ad341-1cc2-40b8-a166-d96dbe06bcc6"). 
InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:12:26.071844 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.071819 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-kube-api-access-ldvg6" (OuterVolumeSpecName: "kube-api-access-ldvg6") pod "a08ad341-1cc2-40b8-a166-d96dbe06bcc6" (UID: "a08ad341-1cc2-40b8-a166-d96dbe06bcc6"). InnerVolumeSpecName "kube-api-access-ldvg6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:12:26.171116 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.171074 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ldvg6\" (UniqueName: \"kubernetes.io/projected/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-kube-api-access-ldvg6\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 22:12:26.171116 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.171111 2566 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a08ad341-1cc2-40b8-a166-d96dbe06bcc6-extensions-socket-volume\") on node \"ip-10-0-137-199.ec2.internal\" DevicePath \"\"" Apr 20 22:12:26.624118 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.624081 2566 generic.go:358] "Generic (PLEG): container finished" podID="a08ad341-1cc2-40b8-a166-d96dbe06bcc6" containerID="04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f" exitCode=0 Apr 20 22:12:26.624333 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.624161 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" Apr 20 22:12:26.624333 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.624165 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" event={"ID":"a08ad341-1cc2-40b8-a166-d96dbe06bcc6","Type":"ContainerDied","Data":"04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f"} Apr 20 22:12:26.624333 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.624203 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj" event={"ID":"a08ad341-1cc2-40b8-a166-d96dbe06bcc6","Type":"ContainerDied","Data":"f549338eb955f528a495fd4359cb74618b95e12e2170a6438c253d19a65827a5"} Apr 20 22:12:26.624333 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.624218 2566 scope.go:117] "RemoveContainer" containerID="04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f" Apr 20 22:12:26.633701 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.633685 2566 scope.go:117] "RemoveContainer" containerID="04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f" Apr 20 22:12:26.633967 ip-10-0-137-199 kubenswrapper[2566]: E0420 22:12:26.633947 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f\": container with ID starting with 04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f not found: ID does not exist" containerID="04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f" Apr 20 22:12:26.634021 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.633976 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f"} err="failed to get container status 
\"04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f\": rpc error: code = NotFound desc = could not find container \"04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f\": container with ID starting with 04bc63e01756a28472473a87904245568b82f576a7ce0690c2064e29f28f948f not found: ID does not exist" Apr 20 22:12:26.646981 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.646960 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj"] Apr 20 22:12:26.650268 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:26.650245 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-q5ftj"] Apr 20 22:12:27.775639 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:12:27.775594 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08ad341-1cc2-40b8-a166-d96dbe06bcc6" path="/var/lib/kubelet/pods/a08ad341-1cc2-40b8-a166-d96dbe06bcc6/volumes" Apr 20 22:13:33.775375 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775347 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95"] Apr 20 22:13:33.775802 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775693 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:13:33.775802 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775703 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:13:33.775802 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775724 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a08ad341-1cc2-40b8-a166-d96dbe06bcc6" containerName="manager" Apr 20 22:13:33.775802 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775729 2566 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a08ad341-1cc2-40b8-a166-d96dbe06bcc6" containerName="manager" Apr 20 22:13:33.775802 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775738 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:13:33.775802 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775743 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:13:33.775802 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775798 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:13:33.776017 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775808 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a08ad341-1cc2-40b8-a166-d96dbe06bcc6" containerName="manager" Apr 20 22:13:33.776017 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.775816 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:13:33.778793 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.778746 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:33.781570 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.781548 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-htr2p\"" Apr 20 22:13:33.789642 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.789612 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95"] Apr 20 22:13:33.948625 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.948598 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clk5g\" (UniqueName: \"kubernetes.io/projected/950a6670-d4c9-42ee-b492-d8cad87bed78-kube-api-access-clk5g\") pod \"kuadrant-operator-controller-manager-55c7f4c975-94m95\" (UID: \"950a6670-d4c9-42ee-b492-d8cad87bed78\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:33.948788 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:33.948637 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/950a6670-d4c9-42ee-b492-d8cad87bed78-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-94m95\" (UID: \"950a6670-d4c9-42ee-b492-d8cad87bed78\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:34.049580 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.049497 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clk5g\" (UniqueName: \"kubernetes.io/projected/950a6670-d4c9-42ee-b492-d8cad87bed78-kube-api-access-clk5g\") pod \"kuadrant-operator-controller-manager-55c7f4c975-94m95\" (UID: \"950a6670-d4c9-42ee-b492-d8cad87bed78\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:34.049580 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.049552 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/950a6670-d4c9-42ee-b492-d8cad87bed78-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-94m95\" (UID: \"950a6670-d4c9-42ee-b492-d8cad87bed78\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:34.049938 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.049918 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/950a6670-d4c9-42ee-b492-d8cad87bed78-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-94m95\" (UID: \"950a6670-d4c9-42ee-b492-d8cad87bed78\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:34.057860 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.057837 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clk5g\" (UniqueName: \"kubernetes.io/projected/950a6670-d4c9-42ee-b492-d8cad87bed78-kube-api-access-clk5g\") pod \"kuadrant-operator-controller-manager-55c7f4c975-94m95\" (UID: \"950a6670-d4c9-42ee-b492-d8cad87bed78\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:34.089468 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.089444 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:34.240606 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.240581 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95"] Apr 20 22:13:34.243218 ip-10-0-137-199 kubenswrapper[2566]: W0420 22:13:34.243193 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod950a6670_d4c9_42ee_b492_d8cad87bed78.slice/crio-d99b1d06833dfb9e757625feef757b05277e322fbac8f0f6e3f4927aefc059bd WatchSource:0}: Error finding container d99b1d06833dfb9e757625feef757b05277e322fbac8f0f6e3f4927aefc059bd: Status 404 returned error can't find the container with id d99b1d06833dfb9e757625feef757b05277e322fbac8f0f6e3f4927aefc059bd Apr 20 22:13:34.245433 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.245416 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 22:13:34.904661 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.904627 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" event={"ID":"950a6670-d4c9-42ee-b492-d8cad87bed78","Type":"ContainerStarted","Data":"6ecd449aaeddcba8ff03e1bdbc44a5a54fc18063bc04d697889a760bc2779de4"} Apr 20 22:13:34.904661 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.904663 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" event={"ID":"950a6670-d4c9-42ee-b492-d8cad87bed78","Type":"ContainerStarted","Data":"d99b1d06833dfb9e757625feef757b05277e322fbac8f0f6e3f4927aefc059bd"} Apr 20 22:13:34.905049 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.904755 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:13:34.927730 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:34.927680 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" podStartSLOduration=1.927664435 podStartE2EDuration="1.927664435s" podCreationTimestamp="2026-04-20 22:13:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:13:34.924461212 +0000 UTC m=+1583.733148576" watchObservedRunningTime="2026-04-20 22:13:34.927664435 +0000 UTC m=+1583.736351798" Apr 20 22:13:45.911536 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:13:45.911505 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-94m95" Apr 20 22:17:11.948443 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:17:11.948337 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:17:11.962812 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:17:11.962785 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:22:11.987664 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:22:11.987637 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:22:12.003513 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:22:12.003494 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log" Apr 20 22:23:15.152973 ip-10-0-137-199 kubenswrapper[2566]: I0420 
22:23:15.152940 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-7xpgh_1aac7f00-8545-40c9-906a-0719d15b0d78/manager/0.log" Apr 20 22:23:15.515359 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:15.515327 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-mmcb7_a2ddb053-015c-4b09-8687-7878eac0bcc5/manager/1.log" Apr 20 22:23:15.750858 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:15.750826 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-f5f47469b-mdkqw_9ea8269c-9fdb-442a-ae81-5d278f62e768/manager/0.log" Apr 20 22:23:15.984551 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:15.984520 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-4rvxn_2f5e5179-dc44-4e9b-8e10-6649652bec36/postgres/0.log" Apr 20 22:23:16.743895 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.743866 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc_a245ea99-e813-4fb4-b96c-eb5f7851d73f/util/0.log" Apr 20 22:23:16.750705 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.750683 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc_a245ea99-e813-4fb4-b96c-eb5f7851d73f/pull/0.log" Apr 20 22:23:16.756906 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.756891 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc_a245ea99-e813-4fb4-b96c-eb5f7851d73f/extract/0.log" Apr 20 22:23:16.860924 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.860897 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z_49fd39c2-5908-43b7-9079-8c696fe2d198/util/0.log" Apr 20 22:23:16.867098 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.867079 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z_49fd39c2-5908-43b7-9079-8c696fe2d198/pull/0.log" Apr 20 22:23:16.873050 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.873034 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z_49fd39c2-5908-43b7-9079-8c696fe2d198/extract/0.log" Apr 20 22:23:16.980588 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.980561 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj_ea440a99-84be-442d-a9c0-eb81422af518/extract/0.log" Apr 20 22:23:16.986255 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.986237 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj_ea440a99-84be-442d-a9c0-eb81422af518/util/0.log" Apr 20 22:23:16.992102 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:16.992085 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj_ea440a99-84be-442d-a9c0-eb81422af518/pull/0.log" Apr 20 22:23:17.096883 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:17.096806 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr_09a177d4-c3f7-4071-9617-cab492afb928/util/0.log" Apr 20 22:23:17.103206 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:17.103184 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr_09a177d4-c3f7-4071-9617-cab492afb928/pull/0.log" Apr 20 22:23:17.108332 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:17.108316 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr_09a177d4-c3f7-4071-9617-cab492afb928/extract/0.log" Apr 20 22:23:17.355916 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:17.355840 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-8dnwh_2d22820f-e38b-4777-bd21-da858275f747/manager/0.log" Apr 20 22:23:17.471837 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:17.471809 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-zlgxv_699ac8a1-5fe5-49ad-acb8-67aaae92caea/manager/0.log" Apr 20 22:23:17.838023 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:17.837991 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-94m95_950a6670-d4c9-42ee-b492-d8cad87bed78/manager/0.log" Apr 20 22:23:18.542856 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:18.542827 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-bk64z_bd180856-1de8-453a-9572-dac5318b40fb/discovery/0.log" Apr 20 22:23:18.649276 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:18.649222 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-d7f98b469-5hqqz_51b22ece-4292-4ffa-baa4-ff3757373b19/kube-auth-proxy/0.log" Apr 20 22:23:18.986464 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:18.986434 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b8b45bf6b-bjzvc_dc92671e-7696-4acb-8e57-f2e271dec9f2/router/0.log" Apr 20 22:23:19.329225 
ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:19.329144 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4_069ce815-d122-4308-a1af-5f1e8c6deea4/storage-initializer/0.log" Apr 20 22:23:19.337588 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:19.337561 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-szmg4_069ce815-d122-4308-a1af-5f1e8c6deea4/main/0.log" Apr 20 22:23:19.682125 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:19.682045 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c_7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d/storage-initializer/0.log" Apr 20 22:23:19.688435 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:19.688413 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck976c_7f4f5d41-9a15-453e-8a7c-0f5ebdc6543d/main/0.log" Apr 20 22:23:26.277626 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:26.277589 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-twt7t_d0ce3c41-e846-4b03-82b0-0fae9d903232/global-pull-secret-syncer/0.log" Apr 20 22:23:26.321487 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:26.321445 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ljbjz_39147d66-f51d-40db-a98e-ac955007f9af/konnectivity-agent/0.log" Apr 20 22:23:26.425467 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:26.425435 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-199.ec2.internal_22aaf800e14f7250fca48df51b47e1cc/haproxy/0.log" Apr 20 22:23:30.305939 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.305914 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc_a245ea99-e813-4fb4-b96c-eb5f7851d73f/extract/0.log" Apr 20 22:23:30.328120 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.328093 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc_a245ea99-e813-4fb4-b96c-eb5f7851d73f/util/0.log" Apr 20 22:23:30.350121 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.350089 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759dtzwc_a245ea99-e813-4fb4-b96c-eb5f7851d73f/pull/0.log" Apr 20 22:23:30.381896 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.381862 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z_49fd39c2-5908-43b7-9079-8c696fe2d198/extract/0.log" Apr 20 22:23:30.403365 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.403338 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z_49fd39c2-5908-43b7-9079-8c696fe2d198/util/0.log" Apr 20 22:23:30.423242 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.423205 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0db98z_49fd39c2-5908-43b7-9079-8c696fe2d198/pull/0.log" Apr 20 22:23:30.450835 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.450809 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj_ea440a99-84be-442d-a9c0-eb81422af518/extract/0.log" Apr 20 22:23:30.471070 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.471044 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj_ea440a99-84be-442d-a9c0-eb81422af518/util/0.log" Apr 20 22:23:30.490535 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.490507 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gdwjj_ea440a99-84be-442d-a9c0-eb81422af518/pull/0.log" Apr 20 22:23:30.517050 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.517007 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr_09a177d4-c3f7-4071-9617-cab492afb928/extract/0.log" Apr 20 22:23:30.536256 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.536232 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr_09a177d4-c3f7-4071-9617-cab492afb928/util/0.log" Apr 20 22:23:30.555070 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.555037 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1qfrnr_09a177d4-c3f7-4071-9617-cab492afb928/pull/0.log" Apr 20 22:23:30.917253 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.917226 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-8dnwh_2d22820f-e38b-4777-bd21-da858275f747/manager/0.log" Apr 20 22:23:30.940742 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:30.940718 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-zlgxv_699ac8a1-5fe5-49ad-acb8-67aaae92caea/manager/0.log" Apr 20 22:23:31.078228 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:31.078199 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-94m95_950a6670-d4c9-42ee-b492-d8cad87bed78/manager/0.log" Apr 20 22:23:32.623837 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.623807 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2452fa3-65aa-4e20-9f82-494b57157bab/alertmanager/0.log" Apr 20 22:23:32.643429 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.643402 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2452fa3-65aa-4e20-9f82-494b57157bab/config-reloader/0.log" Apr 20 22:23:32.665783 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.665758 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2452fa3-65aa-4e20-9f82-494b57157bab/kube-rbac-proxy-web/0.log" Apr 20 22:23:32.685230 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.685203 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2452fa3-65aa-4e20-9f82-494b57157bab/kube-rbac-proxy/0.log" Apr 20 22:23:32.712207 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.712179 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2452fa3-65aa-4e20-9f82-494b57157bab/kube-rbac-proxy-metric/0.log" Apr 20 22:23:32.732086 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.732057 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2452fa3-65aa-4e20-9f82-494b57157bab/prom-label-proxy/0.log" Apr 20 22:23:32.751560 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.751515 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2452fa3-65aa-4e20-9f82-494b57157bab/init-config-reloader/0.log" Apr 20 22:23:32.938779 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.938691 2566 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5fmtx_b715204d-ff93-47f6-a9fa-362fdf6c9628/node-exporter/0.log" Apr 20 22:23:32.958527 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.958498 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5fmtx_b715204d-ff93-47f6-a9fa-362fdf6c9628/kube-rbac-proxy/0.log" Apr 20 22:23:32.989874 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:32.989846 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5fmtx_b715204d-ff93-47f6-a9fa-362fdf6c9628/init-textfile/0.log" Apr 20 22:23:33.154239 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.154212 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gv47s_42dc29c7-316e-4d4b-a0e9-ea3a5161453e/kube-rbac-proxy-main/0.log" Apr 20 22:23:33.173065 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.173038 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gv47s_42dc29c7-316e-4d4b-a0e9-ea3a5161453e/kube-rbac-proxy-self/0.log" Apr 20 22:23:33.193249 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.193153 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gv47s_42dc29c7-316e-4d4b-a0e9-ea3a5161453e/openshift-state-metrics/0.log" Apr 20 22:23:33.387212 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.387187 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-42clz_299181ce-50b6-4ee4-bf17-96747e88ebdf/prometheus-operator/0.log" Apr 20 22:23:33.405257 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.405217 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-42clz_299181ce-50b6-4ee4-bf17-96747e88ebdf/kube-rbac-proxy/0.log" Apr 20 
22:23:33.457422 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.457400 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5775b46cc4-lwprx_ca4dd1d5-9a28-42a0-b615-9ab984977a89/telemeter-client/0.log" Apr 20 22:23:33.476162 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.476133 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5775b46cc4-lwprx_ca4dd1d5-9a28-42a0-b615-9ab984977a89/reload/0.log" Apr 20 22:23:33.495152 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.495113 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5775b46cc4-lwprx_ca4dd1d5-9a28-42a0-b615-9ab984977a89/kube-rbac-proxy/0.log" Apr 20 22:23:33.523423 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.523393 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66755bd5f8-nctmr_cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00/thanos-query/0.log" Apr 20 22:23:33.541714 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.541690 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66755bd5f8-nctmr_cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00/kube-rbac-proxy-web/0.log" Apr 20 22:23:33.561248 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.561222 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66755bd5f8-nctmr_cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00/kube-rbac-proxy/0.log" Apr 20 22:23:33.579724 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.579689 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66755bd5f8-nctmr_cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00/prom-label-proxy/0.log" Apr 20 22:23:33.598647 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.598606 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-66755bd5f8-nctmr_cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00/kube-rbac-proxy-rules/0.log" Apr 20 22:23:33.617302 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:33.617247 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66755bd5f8-nctmr_cfe1d2ca-d19d-496d-b7bf-a3cc60d1cb00/kube-rbac-proxy-metrics/0.log" Apr 20 22:23:35.067355 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.067321 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"] Apr 20 22:23:35.067763 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.067750 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:23:35.067763 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.067765 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:23:35.067849 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.067835 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d7b26a9-ec0d-43f9-bbda-dd45ea6ac3a2" containerName="cleanup" Apr 20 22:23:35.071216 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.071198 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl" Apr 20 22:23:35.074576 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.074554 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5l2ks\"/\"kube-root-ca.crt\"" Apr 20 22:23:35.074713 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.074586 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5l2ks\"/\"openshift-service-ca.crt\"" Apr 20 22:23:35.075889 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.075865 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5l2ks\"/\"default-dockercfg-wqjtp\"" Apr 20 22:23:35.079881 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.079858 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"] Apr 20 22:23:35.220534 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.220498 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-sys\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl" Apr 20 22:23:35.220751 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.220556 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-podres\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl" Apr 20 22:23:35.220751 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.220575 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-99hwq\" (UniqueName: \"kubernetes.io/projected/016477f1-c844-4573-8e85-56f2b945e7a4-kube-api-access-99hwq\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.220751 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.220714 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-proc\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.220905 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.220756 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-lib-modules\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322084 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.321989 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-sys\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322084 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.322064 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-podres\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322368 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.322088 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99hwq\" (UniqueName: \"kubernetes.io/projected/016477f1-c844-4573-8e85-56f2b945e7a4-kube-api-access-99hwq\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322368 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.322126 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-sys\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322368 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.322145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-proc\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322368 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.322171 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-lib-modules\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322368 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.322246 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-podres\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322368 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.322275 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-proc\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.322368 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.322357 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/016477f1-c844-4573-8e85-56f2b945e7a4-lib-modules\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.330142 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.330109 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99hwq\" (UniqueName: \"kubernetes.io/projected/016477f1-c844-4573-8e85-56f2b945e7a4-kube-api-access-99hwq\") pod \"perf-node-gather-daemonset-dxkxl\" (UID: \"016477f1-c844-4573-8e85-56f2b945e7a4\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.382092 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.382055 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:35.512467 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.512436 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"]
Apr 20 22:23:35.513922 ip-10-0-137-199 kubenswrapper[2566]: W0420 22:23:35.513889 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod016477f1_c844_4573_8e85_56f2b945e7a4.slice/crio-9f6c3d909798ffb80f399fa3e4b32c9d79f68fec79a2db9d0eae860d891c2473 WatchSource:0}: Error finding container 9f6c3d909798ffb80f399fa3e4b32c9d79f68fec79a2db9d0eae860d891c2473: Status 404 returned error can't find the container with id 9f6c3d909798ffb80f399fa3e4b32c9d79f68fec79a2db9d0eae860d891c2473
Apr 20 22:23:35.515644 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.515625 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 22:23:35.755136 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.755110 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-844ff484b4-jvpzl_1bf2012b-ff7f-450c-beca-96f972eb0894/console/0.log"
Apr 20 22:23:35.781505 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:35.781477 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-wz52s_c6a9fd05-b4c9-4635-889f-49259cf8782a/download-server/0.log"
Apr 20 22:23:36.378953 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:36.378921 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl" event={"ID":"016477f1-c844-4573-8e85-56f2b945e7a4","Type":"ContainerStarted","Data":"69525d5174befe68f5510e8fbefe56c37ce5d3c59e5ae306200c2fc8f0fe4675"}
Apr 20 22:23:36.378953 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:36.378956 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl" event={"ID":"016477f1-c844-4573-8e85-56f2b945e7a4","Type":"ContainerStarted","Data":"9f6c3d909798ffb80f399fa3e4b32c9d79f68fec79a2db9d0eae860d891c2473"}
Apr 20 22:23:36.379418 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:36.379054 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:36.395045 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:36.394989 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl" podStartSLOduration=1.3949734280000001 podStartE2EDuration="1.394973428s" podCreationTimestamp="2026-04-20 22:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:23:36.392858049 +0000 UTC m=+2185.201545414" watchObservedRunningTime="2026-04-20 22:23:36.394973428 +0000 UTC m=+2185.203660846"
Apr 20 22:23:36.994105 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:36.994078 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdxfg_edf22b9c-7596-4c82-a080-dfbe98377c19/dns/0.log"
Apr 20 22:23:37.013468 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:37.013436 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdxfg_edf22b9c-7596-4c82-a080-dfbe98377c19/kube-rbac-proxy/0.log"
Apr 20 22:23:37.165926 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:37.165898 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t72lj_7201191d-577f-470c-81dc-ec7f86680c09/dns-node-resolver/0.log"
Apr 20 22:23:37.626590 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:37.626561 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8b2mt_ad1b62dc-e2e3-4ab8-95b4-0ee7f0e09468/node-ca/0.log"
Apr 20 22:23:38.510833 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:38.510805 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-bk64z_bd180856-1de8-453a-9572-dac5318b40fb/discovery/0.log"
Apr 20 22:23:38.529325 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:38.529301 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-d7f98b469-5hqqz_51b22ece-4292-4ffa-baa4-ff3757373b19/kube-auth-proxy/0.log"
Apr 20 22:23:38.659566 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:38.659537 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b8b45bf6b-bjzvc_dc92671e-7696-4acb-8e57-f2e271dec9f2/router/0.log"
Apr 20 22:23:39.146512 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:39.146487 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7nmnk_3c4537b6-b9be-4eb9-9b2e-2867dc27db2b/serve-healthcheck-canary/0.log"
Apr 20 22:23:39.794309 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:39.794268 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-knc69_06adda10-864e-4b54-b6d8-4020aa460197/kube-rbac-proxy/0.log"
Apr 20 22:23:39.811627 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:39.811605 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-knc69_06adda10-864e-4b54-b6d8-4020aa460197/exporter/0.log"
Apr 20 22:23:39.833719 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:39.833697 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-knc69_06adda10-864e-4b54-b6d8-4020aa460197/extractor/0.log"
Apr 20 22:23:41.678541 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:41.678512 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-7xpgh_1aac7f00-8545-40c9-906a-0719d15b0d78/manager/0.log"
Apr 20 22:23:41.813762 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:41.813733 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-mmcb7_a2ddb053-015c-4b09-8687-7878eac0bcc5/manager/0.log"
Apr 20 22:23:41.825479 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:41.825451 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-mmcb7_a2ddb053-015c-4b09-8687-7878eac0bcc5/manager/1.log"
Apr 20 22:23:41.868564 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:41.868540 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-f5f47469b-mdkqw_9ea8269c-9fdb-442a-ae81-5d278f62e768/manager/0.log"
Apr 20 22:23:41.938878 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:41.938810 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-4rvxn_2f5e5179-dc44-4e9b-8e10-6649652bec36/postgres/0.log"
Apr 20 22:23:42.392956 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:42.392928 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-dxkxl"
Apr 20 22:23:43.073294 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:43.073262 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-jkhkr_5bb83dec-e814-4768-b568-b10bb95213b7/openshift-lws-operator/0.log"
Apr 20 22:23:47.397095 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:47.397055 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-b8cgm_d17e1fe4-c7f8-4b47-ae22-d0107c62522f/migrator/0.log"
Apr 20 22:23:47.415233 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:47.415203 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-b8cgm_d17e1fe4-c7f8-4b47-ae22-d0107c62522f/graceful-termination/0.log"
Apr 20 22:23:49.000479 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.000450 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9k89_b6c011cb-eacd-4c0a-88d4-f902e63941c3/kube-multus-additional-cni-plugins/0.log"
Apr 20 22:23:49.021224 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.021197 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9k89_b6c011cb-eacd-4c0a-88d4-f902e63941c3/egress-router-binary-copy/0.log"
Apr 20 22:23:49.044821 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.044793 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9k89_b6c011cb-eacd-4c0a-88d4-f902e63941c3/cni-plugins/0.log"
Apr 20 22:23:49.063738 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.063715 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9k89_b6c011cb-eacd-4c0a-88d4-f902e63941c3/bond-cni-plugin/0.log"
Apr 20 22:23:49.081264 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.081242 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9k89_b6c011cb-eacd-4c0a-88d4-f902e63941c3/routeoverride-cni/0.log"
Apr 20 22:23:49.099253 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.099229 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9k89_b6c011cb-eacd-4c0a-88d4-f902e63941c3/whereabouts-cni-bincopy/0.log"
Apr 20 22:23:49.124488 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.124465 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9k89_b6c011cb-eacd-4c0a-88d4-f902e63941c3/whereabouts-cni/0.log"
Apr 20 22:23:49.329755 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.329679 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-swrdw_55307413-a629-4893-b816-dd674a0d602f/kube-multus/0.log"
Apr 20 22:23:49.404725 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.404700 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j8c9k_86a4e942-ea9e-4978-b92e-c96688b972a3/network-metrics-daemon/0.log"
Apr 20 22:23:49.424818 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:49.424789 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j8c9k_86a4e942-ea9e-4978-b92e-c96688b972a3/kube-rbac-proxy/0.log"
Apr 20 22:23:50.240933 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.240905 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-controller/0.log"
Apr 20 22:23:50.255628 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.255606 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/0.log"
Apr 20 22:23:50.264889 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.264871 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovn-acl-logging/1.log"
Apr 20 22:23:50.284311 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.284273 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/kube-rbac-proxy-node/0.log"
Apr 20 22:23:50.302030 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.302011 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 22:23:50.317481 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.317461 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/northd/0.log"
Apr 20 22:23:50.334261 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.334239 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/nbdb/0.log"
Apr 20 22:23:50.351950 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.351890 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/sbdb/0.log"
Apr 20 22:23:50.449413 ip-10-0-137-199 kubenswrapper[2566]: I0420 22:23:50.449388 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvkzb_000763ee-232e-428c-84f2-4ca88f559d17/ovnkube-controller/0.log"