Apr 20 13:30:43.288441 ip-10-0-142-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 13:30:43.744547 ip-10-0-142-144 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 13:30:43.744547 ip-10-0-142-144 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 13:30:43.744547 ip-10-0-142-144 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 13:30:43.744547 ip-10-0-142-144 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 13:30:43.744547 ip-10-0-142-144 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 13:30:43.746412 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.746312 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 13:30:43.749673 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749654 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:30:43.749673 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749669 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:30:43.749673 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749674 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:30:43.749673 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749679 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749683 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749688 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749693 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749697 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749701 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749704 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749708 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749712 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749716 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749720 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749724 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749728 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749731 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749735 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749747 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749751 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749755 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749760 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749763 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:30:43.749905 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749770 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749776 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749780 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749786 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749790 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749794 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749799 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749803 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749807 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749811 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749815 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749820 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749824 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749830 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749836 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749840 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749844 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749848 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749852 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749856 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:30:43.750752 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749860 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749864 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749868 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749873 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749878 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749882 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749886 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749891 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749897 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749902 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749906 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749910 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749914 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749918 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749923 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749927 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749930 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749936 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749941 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:30:43.751487 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749946 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749950 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749954 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749958 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749962 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749966 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749971 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749977 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749982 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749986 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749990 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749994 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.749999 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750003 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750007 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750014 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750019 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750024 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750029 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:30:43.751989 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750034 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750038 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750042 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750047 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750052 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750667 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750676 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750681 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750685 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750689 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750694 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750698 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750703 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750707 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750712 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750716 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750720 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750724 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750728 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750733 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:30:43.752491 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750737 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750742 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750746 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750750 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750754 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750758 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750762 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750769 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750774 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750779 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750783 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750788 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750792 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750796 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750800 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750805 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750810 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750815 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750819 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750823 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:30:43.753029 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750827 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750832 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750836 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750840 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750845 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750849 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750853 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750857 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750861 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750866 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750870 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750874 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750879 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750882 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750886 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750889 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750893 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750897 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750901 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750904 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:30:43.753892 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750909 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750913 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750917 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750922 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750926 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750931 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750935 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750940 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750945 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750949 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750953 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750956 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750960 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750965 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750969 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750973 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750977 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750981 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750986 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750990 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:30:43.754490 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750994 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.750999 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751004 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751009 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751013 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751018 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751022 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751026 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751030 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751034 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.751040 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752909 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752927 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752938 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752946 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752952 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752958 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752965 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752972 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752977 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752981 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 13:30:43.754981 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752988 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752994 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.752999 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753004 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753009 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753014 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753018 2573 flags.go:64] FLAG: --cloud-config=""
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753023 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753027 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753033 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753038 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753043 2573 flags.go:64] FLAG: --config-dir=""
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753047 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753053 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753060 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753065 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753071 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753076 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753081 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753086 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753090 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753096 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753101 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753108 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753113 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 13:30:43.755513 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753118 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753124 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753129 2573 flags.go:64] FLAG: --enable-server="true"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753133 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753249 2573 flags.go:64] FLAG: --event-burst="100"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753257 2573 flags.go:64] FLAG: --event-qps="50"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753263 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753269 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753285 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753293 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753299 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.753305 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754158 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754671 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754679 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754683 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754687 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754690 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754693 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754697 2573 flags.go:64] FLAG: --feature-gates=""
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754702 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754705 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 13:30:43.756108 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:30:43.754709 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754713 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754716 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 20 13:30:43.756108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754720 2573 flags.go:64] FLAG: --help="false" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754723 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-142-144.ec2.internal" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754727 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754730 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754733 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754736 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754740 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754743 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754746 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754749 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754752 2573 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754755 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754758 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754761 2573 flags.go:64] FLAG: --kube-reserved="" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754764 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754767 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754771 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754774 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754777 2573 flags.go:64] FLAG: --lock-file="" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754780 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754783 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754787 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754804 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 13:30:43.756738 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754807 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754810 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: 
I0420 13:30:43.754813 2573 flags.go:64] FLAG: --logging-format="text" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754816 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754820 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754822 2573 flags.go:64] FLAG: --manifest-url="" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754825 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754830 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754833 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754838 2573 flags.go:64] FLAG: --max-pods="110" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754841 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754844 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754847 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754850 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754854 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754857 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754860 2573 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754870 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754874 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754877 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754880 2573 flags.go:64] FLAG: --pod-cidr="" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754883 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754889 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754892 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 13:30:43.757321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754895 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754898 2573 flags.go:64] FLAG: --port="10250" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754901 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754904 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0620a22f2cea3d2ec" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754907 2573 flags.go:64] FLAG: --qos-reserved="" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754910 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754914 
2573 flags.go:64] FLAG: --register-node="true" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754917 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754920 2573 flags.go:64] FLAG: --register-with-taints="" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754923 2573 flags.go:64] FLAG: --registry-burst="10" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754926 2573 flags.go:64] FLAG: --registry-qps="5" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754929 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754932 2573 flags.go:64] FLAG: --reserved-memory="" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754936 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754939 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754942 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754945 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754948 2573 flags.go:64] FLAG: --runonce="false" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754950 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754953 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754956 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 
13:30:43.754959 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754962 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754965 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754968 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754975 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 13:30:43.757901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754977 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754980 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754983 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754986 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754989 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754992 2573 flags.go:64] FLAG: --system-cgroups="" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.754995 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755001 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755004 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755007 2573 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755012 2573 flags.go:64] FLAG: --tls-min-version="" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755015 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755018 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755021 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755024 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755027 2573 flags.go:64] FLAG: --v="2" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755032 2573 flags.go:64] FLAG: --version="false" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755036 2573 flags.go:64] FLAG: --vmodule="" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755041 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.755044 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755158 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755162 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755175 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755178 2573 feature_gate.go:328] unrecognized 
feature gate: NutanixMultiSubnets Apr 20 13:30:43.758537 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755181 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755185 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755188 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755191 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755193 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755196 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755199 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755201 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755205 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755208 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755210 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755214 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755218 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755221 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755224 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755227 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755230 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755232 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755235 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 13:30:43.759121 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755238 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755240 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755243 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755246 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755248 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 13:30:43.759618 ip-10-0-142-144 
kubenswrapper[2573]: W0420 13:30:43.755251 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755254 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755258 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755261 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755264 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755266 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755269 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755272 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755275 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755277 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755280 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755283 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755285 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks 
Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755288 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755290 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 13:30:43.759618 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755293 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755297 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755299 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755302 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755305 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755307 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755310 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755312 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755315 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755317 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755320 2573 feature_gate.go:328] unrecognized feature gate: 
PinnedImages Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755323 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755325 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755328 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755331 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755334 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755338 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755341 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755344 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 13:30:43.760103 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755347 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755350 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755352 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755355 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755358 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755360 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755363 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755365 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755368 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755370 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755372 2573 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755375 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755377 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755380 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755384 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755386 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755389 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755391 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755394 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755397 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:30:43.760591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755399 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:30:43.761079 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755402 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:30:43.761079 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755404 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:30:43.761079 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.755407 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:30:43.761079 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.756159 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 13:30:43.762521 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.762505 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 13:30:43.762521 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.762522 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762569 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762574 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762577 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762580 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762583 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762586 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762589 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762592 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:30:43.762591 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762595 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762598 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762601 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762603 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762606 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762609 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762611 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762614 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762617 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762621 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762625 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762627 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762630 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762632 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762635 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762638 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762640 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762643 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762645 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:30:43.762804 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762648 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762651 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762653 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762656 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762658 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762661 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762663 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762666 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762669 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762672 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762674 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762677 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762681 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762684 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762687 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762690 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762693 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762695 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762698 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762700 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:30:43.763294 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762703 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762705 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762708 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762711 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762713 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762715 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762718 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762721 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762725 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762729 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762732 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762735 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762739 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762741 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762744 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762747 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762749 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762752 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762754 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762757 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:30:43.763775 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762759 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762762 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762765 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762768 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762770 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762774 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762777 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762779 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762782 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762784 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762787 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762789 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762792 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762794 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762797 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762799 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762802 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762804 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:30:43.764263 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762807 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.762812 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762913 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762918 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762921 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762924 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762926 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762929 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762932 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762934 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762937 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762939 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762942 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762944 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762947 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762949 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:30:43.764747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762952 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762954 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762957 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762959 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762963 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762965 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762968 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762971 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762973 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762976 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762978 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762981 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762983 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762985 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762988 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762990 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762993 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762995 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.762998 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763000 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:30:43.765154 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763003 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763005 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763008 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763010 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763013 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763015 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763018 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763020 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763023 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763025 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763029 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763033 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763035 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763038 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763040 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763043 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763045 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763050 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763053 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:30:43.765640 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763056 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763058 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763061 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763064 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763066 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763069 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763071 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763074 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763076 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763079 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763081 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763083 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763086 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763089 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763091 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763094 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763097 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763099 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763102 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763104 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:30:43.766193 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763107 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763109 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763112 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763114 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763117 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763119 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763122 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763124 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763127 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763129 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763132 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763135 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:43.763154 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.763160 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.763841 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 13:30:43.766687 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.765722 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 13:30:43.767052 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.767032 2573 server.go:1019] "Starting client certificate rotation"
Apr 20 13:30:43.767155 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.767126 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 13:30:43.767197 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.767177 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 13:30:43.796512 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.796491 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 13:30:43.799469 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.799452 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 13:30:43.817949 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.817917 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 20 13:30:43.824328 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.824308 2573 log.go:25] "Validated CRI v1 image API"
Apr 20 13:30:43.825671 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.825652 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 13:30:43.829232 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.829213 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 13:30:43.830938 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.830918 2573 fs.go:135] Filesystem UUIDs: map[14014aba-6505-4aa6-a082-5ed676584401:/dev/nvme0n1p3 49893af5-41f0-4bf0-ae1f-3ceb573f0a90:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 20 13:30:43.830987 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.830938 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 13:30:43.837079 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.836966 2573 manager.go:217] Machine: {Timestamp:2026-04-20 13:30:43.834656282 +0000 UTC m=+0.430347348 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099957 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2703430ef41aa440d510c8e1dadb19 SystemUUID:ec270343-0ef4-1aa4-40d5-10c8e1dadb19 BootID:2b515f42-292d-4d38-a2ce-60774040470f Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6d:aa:3a:b8:67 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6d:aa:3a:b8:67 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:5c:df:b9:14:96 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 13:30:43.837079 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.837074 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 13:30:43.837199 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.837173 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 13:30:43.838283 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.838262 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 13:30:43.838430 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.838285 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 13:30:43.838476 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.838440 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 13:30:43.838476 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.838448 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 13:30:43.838476 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.838465 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 13:30:43.839422 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.839412 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 13:30:43.840954 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.840944 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 20 13:30:43.841239 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.841229 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 13:30:43.843731 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.843720 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 20 13:30:43.843763 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.843741 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 13:30:43.843763 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.843753 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 13:30:43.843812 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.843767 2573 kubelet.go:397] "Adding apiserver pod source" Apr 20 13:30:43.843812 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.843777 2573 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 13:30:43.845240 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.845225 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 13:30:43.845292 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.845252 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 13:30:43.849326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.849309 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 13:30:43.850712 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.850699 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 13:30:43.852336 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852314 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 13:30:43.852336 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852335 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852343 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852351 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852360 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852368 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852376 2573 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852384 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852396 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852404 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852418 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 13:30:43.852538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.852431 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 13:30:43.854279 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.854264 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 13:30:43.854279 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.854281 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 13:30:43.855036 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.855013 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 13:30:43.855195 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.855176 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 13:30:43.857758 ip-10-0-142-144 kubenswrapper[2573]: 
I0420 13:30:43.857742 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 13:30:43.857833 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.857786 2573 server.go:1295] "Started kubelet" Apr 20 13:30:43.857892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.857861 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 13:30:43.857975 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.857927 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 13:30:43.858019 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.857992 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 13:30:43.858697 ip-10-0-142-144 systemd[1]: Started Kubernetes Kubelet. Apr 20 13:30:43.859326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.859313 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 13:30:43.865176 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.865157 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 20 13:30:43.869613 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.869438 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 13:30:43.870059 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.870032 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 13:30:43.870224 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.870193 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 13:30:43.870823 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.870640 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 13:30:43.871078 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871059 2573 factory.go:55] Registering systemd factory Apr 20 13:30:43.871078 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871076 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 13:30:43.871226 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871091 2573 factory.go:223] Registration of the systemd container factory successfully Apr 20 13:30:43.871226 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.871062 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found" Apr 20 13:30:43.871226 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871193 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 13:30:43.871226 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871212 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 13:30:43.871403 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871303 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 20 13:30:43.871403 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871313 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 20 13:30:43.871403 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871339 2573 factory.go:153] Registering CRI-O factory Apr 20 13:30:43.871403 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871351 2573 factory.go:223] Registration of the crio container factory 
successfully Apr 20 13:30:43.871403 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.870383 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-144.ec2.internal.18a813d19479adc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-144.ec2.internal,UID:ip-10-0-142-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-144.ec2.internal,},FirstTimestamp:2026-04-20 13:30:43.857755585 +0000 UTC m=+0.453446653,LastTimestamp:2026-04-20 13:30:43.857755585 +0000 UTC m=+0.453446653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-144.ec2.internal,}" Apr 20 13:30:43.871643 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871421 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 13:30:43.871643 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871447 2573 factory.go:103] Registering Raw factory Apr 20 13:30:43.871643 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871461 2573 manager.go:1196] Started watching for new ooms in manager Apr 20 13:30:43.871871 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.871858 2573 manager.go:319] Starting recovery of all containers Apr 20 13:30:43.876259 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.876238 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-95f27" Apr 20 
13:30:43.879109 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.878944 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 13:30:43.879221 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.878997 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 13:30:43.881718 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.881703 2573 manager.go:324] Recovery completed Apr 20 13:30:43.884387 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.884370 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-95f27" Apr 20 13:30:43.886024 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.886012 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:30:43.888393 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.888379 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:30:43.888441 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.888407 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:30:43.888441 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.888427 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:30:43.888872 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:30:43.888859 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 13:30:43.888872 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.888869 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 13:30:43.888947 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.888883 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 20 13:30:43.890346 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.890282 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-144.ec2.internal.18a813d1964d2b12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-144.ec2.internal,UID:ip-10-0-142-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-144.ec2.internal,},FirstTimestamp:2026-04-20 13:30:43.888392978 +0000 UTC m=+0.484084046,LastTimestamp:2026-04-20 13:30:43.888392978 +0000 UTC m=+0.484084046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-144.ec2.internal,}" Apr 20 13:30:43.891032 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.891021 2573 policy_none.go:49] "None policy: Start" Apr 20 13:30:43.891073 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.891036 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 13:30:43.891073 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.891046 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 20 13:30:43.930735 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.930720 2573 manager.go:341] "Starting Device 
Plugin manager" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.930758 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.930770 2573 server.go:85] "Starting device plugin registration server" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.930999 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.931011 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.931099 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.931189 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.931198 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.931822 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 13:30:43.944899 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.931849 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-144.ec2.internal\" not found" Apr 20 13:30:43.973934 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.973909 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 20 13:30:43.975019 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.975000 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 13:30:43.975019 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.975022 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 13:30:43.975173 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.975038 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 13:30:43.975173 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.975045 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 13:30:43.975173 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:43.975079 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 13:30:43.978388 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:43.978368 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:30:44.031644 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.031587 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:30:44.032412 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.032398 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:30:44.032479 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.032424 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:30:44.032479 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.032436 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:30:44.032479 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.032458 2573 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.038966 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.038953 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.039011 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.038973 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-144.ec2.internal\": node \"ip-10-0-142-144.ec2.internal\" not found" Apr 20 13:30:44.054776 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.054754 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found" Apr 20 13:30:44.076174 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.076137 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal"] Apr 20 13:30:44.076231 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.076224 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:30:44.076950 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.076936 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:30:44.077012 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.076962 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:30:44.077012 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.076972 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:30:44.078201 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078187 2573 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 20 13:30:44.078294 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078279 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.078353 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078317 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:30:44.078807 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078793 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:30:44.078807 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078801 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:30:44.078910 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078819 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:30:44.078910 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078821 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:30:44.078910 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078850 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:30:44.078910 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.078834 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:30:44.079963 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.079947 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.080011 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.079973 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:30:44.080642 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.080628 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:30:44.080718 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.080655 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:30:44.080718 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.080670 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:30:44.114042 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.114015 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-144.ec2.internal\" not found" node="ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.118591 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.118574 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-144.ec2.internal\" not found" node="ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.155299 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.155272 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found" Apr 20 13:30:44.173233 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.173209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37832f134222998e163934b3a7a5f97c-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal\" (UID: \"37832f134222998e163934b3a7a5f97c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.173303 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.173240 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1823e330d15f0fe92f9823b1c0261d30-config\") pod \"kube-apiserver-proxy-ip-10-0-142-144.ec2.internal\" (UID: \"1823e330d15f0fe92f9823b1c0261d30\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.173303 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.173258 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/37832f134222998e163934b3a7a5f97c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal\" (UID: \"37832f134222998e163934b3a7a5f97c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.256348 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.256317 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found" Apr 20 13:30:44.273739 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.273720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/37832f134222998e163934b3a7a5f97c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal\" (UID: \"37832f134222998e163934b3a7a5f97c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal" Apr 20 13:30:44.273798 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.273747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/37832f134222998e163934b3a7a5f97c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal\" (UID: \"37832f134222998e163934b3a7a5f97c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:44.273798 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.273764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1823e330d15f0fe92f9823b1c0261d30-config\") pod \"kube-apiserver-proxy-ip-10-0-142-144.ec2.internal\" (UID: \"1823e330d15f0fe92f9823b1c0261d30\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:44.273893 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.273802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1823e330d15f0fe92f9823b1c0261d30-config\") pod \"kube-apiserver-proxy-ip-10-0-142-144.ec2.internal\" (UID: \"1823e330d15f0fe92f9823b1c0261d30\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:44.273893 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.273830 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/37832f134222998e163934b3a7a5f97c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal\" (UID: \"37832f134222998e163934b3a7a5f97c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:44.273893 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.273848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37832f134222998e163934b3a7a5f97c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal\" (UID: \"37832f134222998e163934b3a7a5f97c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:44.357212 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.357179 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found"
Apr 20 13:30:44.417712 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.417679 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:44.421168 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.421134 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:44.457902 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.457869 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found"
Apr 20 13:30:44.558453 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.558422 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found"
Apr 20 13:30:44.658980 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.658907 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found"
Apr 20 13:30:44.759532 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.759501 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found"
Apr 20 13:30:44.766664 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.766642 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 13:30:44.766794 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.766779 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 13:30:44.859620 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.859589 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found"
Apr 20 13:30:44.869759 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.869736 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 13:30:44.886361 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.886324 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 13:25:43 +0000 UTC" deadline="2028-01-19 14:02:40.028902582 +0000 UTC"
Apr 20 13:30:44.886361 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.886359 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15336h31m55.142547435s"
Apr 20 13:30:44.890368 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.890347 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 13:30:44.915996 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:44.915936 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1823e330d15f0fe92f9823b1c0261d30.slice/crio-3a5594ed1e741c75342053b2f5daeb2ab19a051f81542c29f9b225f3031d0de2 WatchSource:0}: Error finding container 3a5594ed1e741c75342053b2f5daeb2ab19a051f81542c29f9b225f3031d0de2: Status 404 returned error can't find the container with id 3a5594ed1e741c75342053b2f5daeb2ab19a051f81542c29f9b225f3031d0de2
Apr 20 13:30:44.916525 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:44.916495 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37832f134222998e163934b3a7a5f97c.slice/crio-8811d210bf4fc20b192e265425715ec6a248a5d4ac2190f3c756d5ed1bcb3b88 WatchSource:0}: Error finding container 8811d210bf4fc20b192e265425715ec6a248a5d4ac2190f3c756d5ed1bcb3b88: Status 404 returned error can't find the container with id 8811d210bf4fc20b192e265425715ec6a248a5d4ac2190f3c756d5ed1bcb3b88
Apr 20 13:30:44.919408 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.919380 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fp8dp"
Apr 20 13:30:44.920186 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.920167 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 13:30:44.927724 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.927706 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fp8dp"
Apr 20 13:30:44.959963 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:44.959937 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found"
Apr 20 13:30:44.978217 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.978173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal" event={"ID":"1823e330d15f0fe92f9823b1c0261d30","Type":"ContainerStarted","Data":"3a5594ed1e741c75342053b2f5daeb2ab19a051f81542c29f9b225f3031d0de2"}
Apr 20 13:30:44.979096 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:44.979078 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal" event={"ID":"37832f134222998e163934b3a7a5f97c","Type":"ContainerStarted","Data":"8811d210bf4fc20b192e265425715ec6a248a5d4ac2190f3c756d5ed1bcb3b88"}
Apr 20 13:30:45.060305 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:45.060277 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-144.ec2.internal\" not found"
Apr 20 13:30:45.069070 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.069049 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 13:30:45.118214 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.118185 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 13:30:45.171204 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.171124 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:45.184659 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.184633 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 13:30:45.186515 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.186497 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal"
Apr 20 13:30:45.198365 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.198346 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 13:30:45.225688 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.225668 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 13:30:45.845477 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.845447 2573 apiserver.go:52] "Watching apiserver"
Apr 20 13:30:45.855897 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.855870 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 13:30:45.857091 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.857065 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4qdsh","openshift-multus/multus-additional-cni-plugins-g88jt","openshift-network-diagnostics/network-check-target-sb687","kube-system/konnectivity-agent-tm5fj","openshift-cluster-node-tuning-operator/tuned-gp6dp","openshift-image-registry/node-ca-kmm7p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal","openshift-multus/multus-2s69b","openshift-multus/network-metrics-daemon-5w9cl","openshift-network-operator/iptables-alerter-sgbrl","openshift-ovn-kubernetes/ovnkube-node-drksq","kube-system/global-pull-secret-syncer-5w2mr","kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89"]
Apr 20 13:30:45.858509 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.858485 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sgbrl"
Apr 20 13:30:45.859813 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.859793 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.860969 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.860949 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 13:30:45.861072 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.861017 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 13:30:45.861072 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.861053 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-z5xx8\""
Apr 20 13:30:45.861271 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.861255 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 13:30:45.861916 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.861895 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-phdfg\""
Apr 20 13:30:45.862158 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.862127 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:45.862242 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.862187 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 13:30:45.862304 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:45.862248 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518"
Apr 20 13:30:45.863133 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.863115 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 13:30:45.863324 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.863306 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 13:30:45.863324 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.863322 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 13:30:45.863485 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.863470 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tm5fj"
Apr 20 13:30:45.863552 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.863517 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 13:30:45.863699 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.863679 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 13:30:45.864515 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.864450 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gp6dp"
Apr 20 13:30:45.864609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.864562 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kmm7p"
Apr 20 13:30:45.865405 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.865386 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 13:30:45.865558 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.865541 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 13:30:45.865630 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.865609 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gdjnq\""
Apr 20 13:30:45.865815 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.865711 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:45.865891 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:45.865806 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f"
Apr 20 13:30:45.867700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.866477 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 13:30:45.867700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.866604 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 13:30:45.867700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.867037 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 13:30:45.867700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.867092 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 13:30:45.867700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.867125 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89"
Apr 20 13:30:45.867700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.867252 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-54ggv\""
Apr 20 13:30:45.867700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.867444 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 13:30:45.867700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.867660 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9q7gl\""
Apr 20 13:30:45.869588 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.869556 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 13:30:45.869808 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.869787 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 13:30:45.869986 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.869967 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4qdsh"
Apr 20 13:30:45.870057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.870029 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 13:30:45.870302 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.870281 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kmvbg\""
Apr 20 13:30:45.871502 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.871477 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g88jt"
Apr 20 13:30:45.872023 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.872006 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 13:30:45.872442 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.872421 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 13:30:45.872557 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.872423 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d66cx\""
Apr 20 13:30:45.873078 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.873061 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:45.873176 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.873096 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.873176 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:45.873165 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79"
Apr 20 13:30:45.873506 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.873403 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 13:30:45.873506 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.873466 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gchtz\""
Apr 20 13:30:45.873506 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.873471 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 13:30:45.874373 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.873857 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 13:30:45.874373 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.874135 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 13:30:45.874373 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.874320 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 13:30:45.875036 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.875006 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 13:30:45.875127 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.875092 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mdmhs\""
Apr 20 13:30:45.882386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.882363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jpt\" (UniqueName: \"kubernetes.io/projected/75eb0979-9a66-40d2-a063-6f592c87a4f1-kube-api-access-48jpt\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp"
Apr 20 13:30:45.882539 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.882505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e473fb6-5d6c-47e5-9f17-d87b134e316e-serviceca\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " pod="openshift-image-registry/node-ca-kmm7p"
Apr 20 13:30:45.882616 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.882582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-sys-fs\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89"
Apr 20 13:30:45.882692 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.882669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-systemd\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.883152 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/83d0b332-6e36-4e2e-8231-501955bcf71b-iptables-alerter-script\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl"
Apr 20 13:30:45.883224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-systemd-units\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.883224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883191 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-tuned\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp"
Apr 20 13:30:45.883224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75eb0979-9a66-40d2-a063-6f592c87a4f1-tmp\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp"
Apr 20 13:30:45.883386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41710337-4f82-4bb8-abe7-f7a5cc3d9802-hosts-file\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh"
Apr 20 13:30:45.883386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883250 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7673b95-6e38-4e6e-84a1-c083cd4e6356-agent-certs\") pod \"konnectivity-agent-tm5fj\" (UID: \"a7673b95-6e38-4e6e-84a1-c083cd4e6356\") " pod="kube-system/konnectivity-agent-tm5fj"
Apr 20 13:30:45.883386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7673b95-6e38-4e6e-84a1-c083cd4e6356-konnectivity-ca\") pod \"konnectivity-agent-tm5fj\" (UID: \"a7673b95-6e38-4e6e-84a1-c083cd4e6356\") " pod="kube-system/konnectivity-agent-tm5fj"
Apr 20 13:30:45.883386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883289 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89"
Apr 20 13:30:45.883386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883311 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-etc-selinux\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89"
Apr 20 13:30:45.883386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883334 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-slash\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.883386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-hostroot\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883402 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-etc-kubernetes\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-ovn\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883466 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovn-node-metrics-cert\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-var-lib-kubelet\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883547 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysctl-conf\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-registration-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-netns\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-log-socket\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.883714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883703 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-env-overrides\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfcj\" (UniqueName: \"kubernetes.io/projected/932d1d43-95d3-476c-b3d2-da80b4fcf711-kube-api-access-2xfcj\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883760 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-cni-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-cnibin\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-os-release\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883823 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-k8s-cni-cncf-io\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-cni-multus\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-run-netns\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883890 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-etc-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-run-ovn-kubernetes\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883935 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovnkube-config\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-cnibin\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.883981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-os-release\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d66n\" (UniqueName: \"kubernetes.io/projected/41710337-4f82-4bb8-abe7-f7a5cc3d9802-kube-api-access-9d66n\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884047 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-cni-bin\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:30:45.884193 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvcvg\" (UniqueName: \"kubernetes.io/projected/c9955ecd-fee7-409f-b733-5e9973245030-kube-api-access-vvcvg\") pod
\"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884093 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-system-cni-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-kubelet\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-daemon-config\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-multus-certs\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-var-lib-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884227 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysctl-d\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884245 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-systemd\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-sys\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e473fb6-5d6c-47e5-9f17-d87b134e316e-host\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884381 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dtq\" (UniqueName: \"kubernetes.io/projected/29c60f5b-f12d-43ec-a794-f2abbe748308-kube-api-access-74dtq\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884410 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41710337-4f82-4bb8-abe7-f7a5cc3d9802-tmp-dir\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-socket-dir-parent\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-kubelet\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-cni-bin\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " 
pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884515 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.884936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884567 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysconfig\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4xw\" (UniqueName: \"kubernetes.io/projected/5e473fb6-5d6c-47e5-9f17-d87b134e316e-kube-api-access-kn4xw\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-system-cni-dir\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-cni-binary-copy\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-node-log\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-modprobe-d\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-run\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovnkube-script-lib\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 
13:30:45.884763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-host\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884786 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l6hx\" (UniqueName: \"kubernetes.io/projected/0de99a89-e8e5-491a-90c3-5c371ed6705f-kube-api-access-6l6hx\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83d0b332-6e36-4e2e-8231-501955bcf71b-host-slash\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884832 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-socket-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-device-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: 
\"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78xkm\" (UniqueName: \"kubernetes.io/projected/83d0b332-6e36-4e2e-8231-501955bcf71b-kube-api-access-78xkm\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884907 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-cni-netd\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884929 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-kubernetes\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.885705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.884953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-lib-modules\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.886466 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.885003 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.886466 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.885029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-conf-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.886466 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.885054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bbz7\" (UniqueName: \"kubernetes.io/projected/6f769d40-1c0a-4957-8061-892b0f5e5266-kube-api-access-6bbz7\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.886466 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.885076 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:30:45.886466 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.885102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" 
Apr 20 13:30:45.886466 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.885157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f769d40-1c0a-4957-8061-892b0f5e5266-cni-binary-copy\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.928820 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.928787 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 13:25:44 +0000 UTC" deadline="2028-01-21 08:52:03.52391909 +0000 UTC" Apr 20 13:30:45.928820 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.928819 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15379h21m17.595103141s" Apr 20 13:30:45.972157 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.972122 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 13:30:45.972936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.972914 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:30:45.986239 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysctl-conf\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.986344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-registration-dir\") pod 
\"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.986344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-netns\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-log-socket\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.986344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986322 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-env-overrides\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfcj\" (UniqueName: \"kubernetes.io/projected/932d1d43-95d3-476c-b3d2-da80b4fcf711-kube-api-access-2xfcj\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986375 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-cni-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-cnibin\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-netns\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-os-release\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-log-socket\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-registration-dir\") pod 
\"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-k8s-cni-cncf-io\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-k8s-cni-cncf-io\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-cni-multus\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-cni-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.986537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-run-netns\") 
pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-os-release\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-etc-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-cnibin\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-run-ovn-kubernetes\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysctl-conf\") pod \"tuned-gp6dp\" (UID: 
\"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-run-netns\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-etc-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovnkube-config\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-run-ovn-kubernetes\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-cnibin\") pod 
\"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986789 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-os-release\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-cnibin\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9d66n\" (UniqueName: \"kubernetes.io/projected/41710337-4f82-4bb8-abe7-f7a5cc3d9802-kube-api-access-9d66n\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-os-release\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-cni-multus\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.986982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-cni-bin\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvcvg\" (UniqueName: \"kubernetes.io/projected/c9955ecd-fee7-409f-b733-5e9973245030-kube-api-access-vvcvg\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-cni-bin\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-system-cni-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987291 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-system-cni-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-kubelet\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-daemon-config\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-kubelet\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-multus-certs\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-var-lib-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysctl-d\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987471 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-systemd\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987494 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-sys\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e473fb6-5d6c-47e5-9f17-d87b134e316e-host\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-run-multus-certs\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74dtq\" (UniqueName: \"kubernetes.io/projected/29c60f5b-f12d-43ec-a794-f2abbe748308-kube-api-access-74dtq\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41710337-4f82-4bb8-abe7-f7a5cc3d9802-tmp-dir\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh" Apr 20 13:30:45.987803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987607 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-systemd\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-socket-dir-parent\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-var-lib-openvswitch\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e473fb6-5d6c-47e5-9f17-d87b134e316e-host\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987765 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-sys\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysctl-d\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-kubelet\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-socket-dir-parent\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-env-overrides\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-kubelet\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-cni-bin\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.987840 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-host-var-lib-cni-bin\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-daemon-config\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41710337-4f82-4bb8-abe7-f7a5cc3d9802-tmp-dir\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysconfig\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988185 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4xw\" (UniqueName: \"kubernetes.io/projected/5e473fb6-5d6c-47e5-9f17-d87b134e316e-kube-api-access-kn4xw\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:45.988609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988201 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-sysconfig\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-system-cni-dir\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988208 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovnkube-config\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-cni-binary-copy\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-node-log\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-modprobe-d\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-system-cni-dir\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 
13:30:45.988327 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-run\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988372 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-node-log\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3b76b883-7171-4efb-b2ac-fc558e9fdf79-kubelet-config\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3b76b883-7171-4efb-b2ac-fc558e9fdf79-dbus\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovnkube-script-lib\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.989417 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:30:45.988482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-modprobe-d\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-host\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988502 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-run\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l6hx\" (UniqueName: \"kubernetes.io/projected/0de99a89-e8e5-491a-90c3-5c371ed6705f-kube-api-access-6l6hx\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:30:45.989417 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83d0b332-6e36-4e2e-8231-501955bcf71b-host-slash\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:45.989417 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:30:45.988794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83d0b332-6e36-4e2e-8231-501955bcf71b-host-slash\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988851 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-socket-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988883 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-cni-binary-copy\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-device-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: 
\"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.988954 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-device-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989056 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-socket-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78xkm\" (UniqueName: \"kubernetes.io/projected/83d0b332-6e36-4e2e-8231-501955bcf71b-kube-api-access-78xkm\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989110 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-host\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovnkube-script-lib\") 
pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-cni-netd\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-kubernetes\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-lib-modules\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-kubernetes\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-lib-modules\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-conf-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.990025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bbz7\" (UniqueName: \"kubernetes.io/projected/6f769d40-1c0a-4957-8061-892b0f5e5266-kube-api-access-6bbz7\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-cni-netd\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989535 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989632 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-multus-conf-dir\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:45.989665 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989674 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:45.989846 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs podName:0de99a89-e8e5-491a-90c3-5c371ed6705f nodeName:}" failed. No retries permitted until 2026-04-20 13:30:46.489801729 +0000 UTC m=+3.085492801 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs") pod "network-metrics-daemon-5w9cl" (UID: "0de99a89-e8e5-491a-90c3-5c371ed6705f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f769d40-1c0a-4957-8061-892b0f5e5266-cni-binary-copy\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48jpt\" (UniqueName: \"kubernetes.io/projected/75eb0979-9a66-40d2-a063-6f592c87a4f1-kube-api-access-48jpt\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29c60f5b-f12d-43ec-a794-f2abbe748308-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e473fb6-5d6c-47e5-9f17-d87b134e316e-serviceca\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.989986 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-sys-fs\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-systemd\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/83d0b332-6e36-4e2e-8231-501955bcf71b-iptables-alerter-script\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-systemd-units\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-tuned\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.990857 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990182 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75eb0979-9a66-40d2-a063-6f592c87a4f1-tmp\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41710337-4f82-4bb8-abe7-f7a5cc3d9802-hosts-file\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7673b95-6e38-4e6e-84a1-c083cd4e6356-agent-certs\") pod \"konnectivity-agent-tm5fj\" (UID: \"a7673b95-6e38-4e6e-84a1-c083cd4e6356\") " pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7673b95-6e38-4e6e-84a1-c083cd4e6356-konnectivity-ca\") pod \"konnectivity-agent-tm5fj\" (UID: \"a7673b95-6e38-4e6e-84a1-c083cd4e6356\") " 
pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-etc-selinux\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-slash\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f769d40-1c0a-4957-8061-892b0f5e5266-cni-binary-copy\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e473fb6-5d6c-47e5-9f17-d87b134e316e-serviceca\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " 
pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-hostroot\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990606 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-hostroot\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-etc-kubernetes\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-sys-fs\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990674 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-ovn\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.991585 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:30:45.990716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-systemd\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990734 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovn-node-metrics-cert\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990791 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f769d40-1c0a-4957-8061-892b0f5e5266-etc-kubernetes\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-var-lib-kubelet\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.991585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-systemd-units\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.992399 ip-10-0-142-144 kubenswrapper[2573]: 
I0420 13:30:45.990832 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.992399 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.990881 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:30:45.992399 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.991164 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 13:30:45.992399 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.991819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/83d0b332-6e36-4e2e-8231-501955bcf71b-iptables-alerter-script\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:45.992399 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.992224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7673b95-6e38-4e6e-84a1-c083cd4e6356-konnectivity-ca\") pod \"konnectivity-agent-tm5fj\" (UID: \"a7673b95-6e38-4e6e-84a1-c083cd4e6356\") " pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:30:45.992399 ip-10-0-142-144 kubenswrapper[2573]: I0420 
13:30:45.992332 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41710337-4f82-4bb8-abe7-f7a5cc3d9802-hosts-file\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh" Apr 20 13:30:45.993389 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.992919 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-etc-selinux\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.993389 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.992994 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9955ecd-fee7-409f-b733-5e9973245030-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:45.993389 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.993051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-host-slash\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.993389 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.993218 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29c60f5b-f12d-43ec-a794-f2abbe748308-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:45.993389 
ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.993315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75eb0979-9a66-40d2-a063-6f592c87a4f1-var-lib-kubelet\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.994451 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.993708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/932d1d43-95d3-476c-b3d2-da80b4fcf711-run-ovn\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:45.997603 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.997580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/75eb0979-9a66-40d2-a063-6f592c87a4f1-etc-tuned\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:45.999315 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:45.999297 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7673b95-6e38-4e6e-84a1-c083cd4e6356-agent-certs\") pod \"konnectivity-agent-tm5fj\" (UID: \"a7673b95-6e38-4e6e-84a1-c083cd4e6356\") " pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:30:46.000676 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.000653 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4xw\" (UniqueName: \"kubernetes.io/projected/5e473fb6-5d6c-47e5-9f17-d87b134e316e-kube-api-access-kn4xw\") pod \"node-ca-kmm7p\" (UID: \"5e473fb6-5d6c-47e5-9f17-d87b134e316e\") " pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:46.001758 ip-10-0-142-144 kubenswrapper[2573]: I0420 
13:30:46.001210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dtq\" (UniqueName: \"kubernetes.io/projected/29c60f5b-f12d-43ec-a794-f2abbe748308-kube-api-access-74dtq\") pod \"multus-additional-cni-plugins-g88jt\" (UID: \"29c60f5b-f12d-43ec-a794-f2abbe748308\") " pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:46.003448 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.003080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bbz7\" (UniqueName: \"kubernetes.io/projected/6f769d40-1c0a-4957-8061-892b0f5e5266-kube-api-access-6bbz7\") pod \"multus-2s69b\" (UID: \"6f769d40-1c0a-4957-8061-892b0f5e5266\") " pod="openshift-multus/multus-2s69b" Apr 20 13:30:46.003680 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.003657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvcvg\" (UniqueName: \"kubernetes.io/projected/c9955ecd-fee7-409f-b733-5e9973245030-kube-api-access-vvcvg\") pod \"aws-ebs-csi-driver-node-lbp89\" (UID: \"c9955ecd-fee7-409f-b733-5e9973245030\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:46.003755 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.003690 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:30:46.003755 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.003710 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:30:46.003755 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.003713 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l6hx\" (UniqueName: \"kubernetes.io/projected/0de99a89-e8e5-491a-90c3-5c371ed6705f-kube-api-access-6l6hx\") pod 
\"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:30:46.003755 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.003723 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z7gpw for pod openshift-network-diagnostics/network-check-target-sb687: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:46.003967 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.003775 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw podName:ff4d6258-b35d-4f25-b171-7ce6265db518 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:46.503759928 +0000 UTC m=+3.099450995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z7gpw" (UniqueName: "kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw") pod "network-check-target-sb687" (UID: "ff4d6258-b35d-4f25-b171-7ce6265db518") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:46.003967 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.003886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75eb0979-9a66-40d2-a063-6f592c87a4f1-tmp\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:46.004136 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.004112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/932d1d43-95d3-476c-b3d2-da80b4fcf711-ovn-node-metrics-cert\") pod 
\"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:46.004373 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.004351 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d66n\" (UniqueName: \"kubernetes.io/projected/41710337-4f82-4bb8-abe7-f7a5cc3d9802-kube-api-access-9d66n\") pod \"node-resolver-4qdsh\" (UID: \"41710337-4f82-4bb8-abe7-f7a5cc3d9802\") " pod="openshift-dns/node-resolver-4qdsh" Apr 20 13:30:46.005645 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.005620 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78xkm\" (UniqueName: \"kubernetes.io/projected/83d0b332-6e36-4e2e-8231-501955bcf71b-kube-api-access-78xkm\") pod \"iptables-alerter-sgbrl\" (UID: \"83d0b332-6e36-4e2e-8231-501955bcf71b\") " pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:46.005769 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.005748 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfcj\" (UniqueName: \"kubernetes.io/projected/932d1d43-95d3-476c-b3d2-da80b4fcf711-kube-api-access-2xfcj\") pod \"ovnkube-node-drksq\" (UID: \"932d1d43-95d3-476c-b3d2-da80b4fcf711\") " pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:46.007224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.007203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jpt\" (UniqueName: \"kubernetes.io/projected/75eb0979-9a66-40d2-a063-6f592c87a4f1-kube-api-access-48jpt\") pod \"tuned-gp6dp\" (UID: \"75eb0979-9a66-40d2-a063-6f592c87a4f1\") " pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:46.091934 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.091904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/3b76b883-7171-4efb-b2ac-fc558e9fdf79-kubelet-config\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:46.091934 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.091935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3b76b883-7171-4efb-b2ac-fc558e9fdf79-dbus\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:46.092114 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.091960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:46.092114 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.092014 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3b76b883-7171-4efb-b2ac-fc558e9fdf79-kubelet-config\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:46.092114 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.092097 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 13:30:46.092248 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.092164 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret podName:3b76b883-7171-4efb-b2ac-fc558e9fdf79 nodeName:}" failed. 
No retries permitted until 2026-04-20 13:30:46.59213194 +0000 UTC m=+3.187822995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret") pod "global-pull-secret-syncer-5w2mr" (UID: "3b76b883-7171-4efb-b2ac-fc558e9fdf79") : object "kube-system"/"original-pull-secret" not registered Apr 20 13:30:46.092248 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.092171 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3b76b883-7171-4efb-b2ac-fc558e9fdf79-dbus\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:46.174514 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.174437 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sgbrl" Apr 20 13:30:46.184353 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.184330 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:30:46.193003 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.192982 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:30:46.198619 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.198596 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" Apr 20 13:30:46.207196 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.207174 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kmm7p" Apr 20 13:30:46.213710 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.213694 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" Apr 20 13:30:46.221216 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.221197 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4qdsh" Apr 20 13:30:46.228722 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.228704 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g88jt" Apr 20 13:30:46.235252 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.235231 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2s69b" Apr 20 13:30:46.495476 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.495408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:30:46.495633 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.495525 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:46.495633 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.495584 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs podName:0de99a89-e8e5-491a-90c3-5c371ed6705f nodeName:}" failed. No retries permitted until 2026-04-20 13:30:47.495566459 +0000 UTC m=+4.091257527 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs") pod "network-metrics-daemon-5w9cl" (UID: "0de99a89-e8e5-491a-90c3-5c371ed6705f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:46.587593 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.587560 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7673b95_6e38_4e6e_84a1_c083cd4e6356.slice/crio-4d80e48be2b6eb205c04d91d77d21b450282d35f1f0bcc540bbc8b5bbfd06541 WatchSource:0}: Error finding container 4d80e48be2b6eb205c04d91d77d21b450282d35f1f0bcc540bbc8b5bbfd06541: Status 404 returned error can't find the container with id 4d80e48be2b6eb205c04d91d77d21b450282d35f1f0bcc540bbc8b5bbfd06541 Apr 20 13:30:46.595840 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.595820 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:30:46.595941 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.595860 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:30:46.596002 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.595945 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 13:30:46.596002 ip-10-0-142-144 kubenswrapper[2573]: E0420 
13:30:46.595981 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 13:30:46.596002 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.596001 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 13:30:46.596127 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.596013 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z7gpw for pod openshift-network-diagnostics/network-check-target-sb687: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:46.596127 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.595986 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret podName:3b76b883-7171-4efb-b2ac-fc558e9fdf79 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:47.595974461 +0000 UTC m=+4.191665515 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret") pod "global-pull-secret-syncer-5w2mr" (UID: "3b76b883-7171-4efb-b2ac-fc558e9fdf79") : object "kube-system"/"original-pull-secret" not registered
Apr 20 13:30:46.596127 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:46.596073 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw podName:ff4d6258-b35d-4f25-b171-7ce6265db518 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:47.59605559 +0000 UTC m=+4.191746648 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z7gpw" (UniqueName: "kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw") pod "network-check-target-sb687" (UID: "ff4d6258-b35d-4f25-b171-7ce6265db518") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:46.605374 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.605342 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83d0b332_6e36_4e2e_8231_501955bcf71b.slice/crio-0f8f46557f74f1d2f7cb98a0743dc920a10919eb0efc94d6f4a7890d670229e6 WatchSource:0}: Error finding container 0f8f46557f74f1d2f7cb98a0743dc920a10919eb0efc94d6f4a7890d670229e6: Status 404 returned error can't find the container with id 0f8f46557f74f1d2f7cb98a0743dc920a10919eb0efc94d6f4a7890d670229e6
Apr 20 13:30:46.606227 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.606205 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f769d40_1c0a_4957_8061_892b0f5e5266.slice/crio-6c8bc043af48bb7495ba1cfc8f94168e61b79330f0d9686be967f66f4d1bcaed WatchSource:0}: Error finding container 6c8bc043af48bb7495ba1cfc8f94168e61b79330f0d9686be967f66f4d1bcaed: Status 404 returned error can't find the container with id 6c8bc043af48bb7495ba1cfc8f94168e61b79330f0d9686be967f66f4d1bcaed
Apr 20 13:30:46.606963 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.606942 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e473fb6_5d6c_47e5_9f17_d87b134e316e.slice/crio-6da6703cbebc699f50d40ba38c51130fa0128f4a4d3359103073438cc48ccc5f WatchSource:0}: Error finding container 6da6703cbebc699f50d40ba38c51130fa0128f4a4d3359103073438cc48ccc5f: Status 404 returned error can't find the container with id 6da6703cbebc699f50d40ba38c51130fa0128f4a4d3359103073438cc48ccc5f
Apr 20 13:30:46.611659 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.611520 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9955ecd_fee7_409f_b733_5e9973245030.slice/crio-acae6e7a2819450f94809547fa478749d8c745eb4ea77cb942c39fb3daa5083e WatchSource:0}: Error finding container acae6e7a2819450f94809547fa478749d8c745eb4ea77cb942c39fb3daa5083e: Status 404 returned error can't find the container with id acae6e7a2819450f94809547fa478749d8c745eb4ea77cb942c39fb3daa5083e
Apr 20 13:30:46.613413 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.613392 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c60f5b_f12d_43ec_a794_f2abbe748308.slice/crio-cb82b1af3f97049612e99ab6fad4247ea62bea9af6c52ddbf55cc22c1776d3e7 WatchSource:0}: Error finding container cb82b1af3f97049612e99ab6fad4247ea62bea9af6c52ddbf55cc22c1776d3e7: Status 404 returned error can't find the container with id cb82b1af3f97049612e99ab6fad4247ea62bea9af6c52ddbf55cc22c1776d3e7
Apr 20 13:30:46.614003 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.613989 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75eb0979_9a66_40d2_a063_6f592c87a4f1.slice/crio-247b0adf65ac14e7fe20f11ca84a1642b918e22c90a201765babf3f9612a6d82 WatchSource:0}: Error finding container 247b0adf65ac14e7fe20f11ca84a1642b918e22c90a201765babf3f9612a6d82: Status 404 returned error can't find the container with id 247b0adf65ac14e7fe20f11ca84a1642b918e22c90a201765babf3f9612a6d82
Apr 20 13:30:46.615031 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.615011 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41710337_4f82_4bb8_abe7_f7a5cc3d9802.slice/crio-11cbb9c6c9dc20baeabdc491123de0ab87494250a87b73b7b6c5ffe76e662f01 WatchSource:0}: Error finding container 11cbb9c6c9dc20baeabdc491123de0ab87494250a87b73b7b6c5ffe76e662f01: Status 404 returned error can't find the container with id 11cbb9c6c9dc20baeabdc491123de0ab87494250a87b73b7b6c5ffe76e662f01
Apr 20 13:30:46.616300 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:30:46.616176 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod932d1d43_95d3_476c_b3d2_da80b4fcf711.slice/crio-3b5ff3811fa7c09871d7f5068c23a3ef07bbaf94eb6d2962e4ef64d72a364e9a WatchSource:0}: Error finding container 3b5ff3811fa7c09871d7f5068c23a3ef07bbaf94eb6d2962e4ef64d72a364e9a: Status 404 returned error can't find the container with id 3b5ff3811fa7c09871d7f5068c23a3ef07bbaf94eb6d2962e4ef64d72a364e9a
Apr 20 13:30:46.930136 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.929887 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 13:25:44 +0000 UTC" deadline="2027-11-29 12:19:06.401852736 +0000 UTC"
Apr 20 13:30:46.930136 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.930131 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14110h48m19.471725658s"
Apr 20 13:30:46.984691 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.984649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerStarted","Data":"cb82b1af3f97049612e99ab6fad4247ea62bea9af6c52ddbf55cc22c1776d3e7"}
Apr 20 13:30:46.985821 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.985785 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" event={"ID":"c9955ecd-fee7-409f-b733-5e9973245030","Type":"ContainerStarted","Data":"acae6e7a2819450f94809547fa478749d8c745eb4ea77cb942c39fb3daa5083e"}
Apr 20 13:30:46.987029 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.986988 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kmm7p" event={"ID":"5e473fb6-5d6c-47e5-9f17-d87b134e316e","Type":"ContainerStarted","Data":"6da6703cbebc699f50d40ba38c51130fa0128f4a4d3359103073438cc48ccc5f"}
Apr 20 13:30:46.989065 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.989038 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2s69b" event={"ID":"6f769d40-1c0a-4957-8061-892b0f5e5266","Type":"ContainerStarted","Data":"6c8bc043af48bb7495ba1cfc8f94168e61b79330f0d9686be967f66f4d1bcaed"}
Apr 20 13:30:46.990419 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.990396 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sgbrl" event={"ID":"83d0b332-6e36-4e2e-8231-501955bcf71b","Type":"ContainerStarted","Data":"0f8f46557f74f1d2f7cb98a0743dc920a10919eb0efc94d6f4a7890d670229e6"}
Apr 20 13:30:46.993252 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.993230 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal" event={"ID":"1823e330d15f0fe92f9823b1c0261d30","Type":"ContainerStarted","Data":"eb439f86fca6d306e3ee689b694767c89198720413001d664873cf2c88813ab2"}
Apr 20 13:30:46.995266 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.995221 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"3b5ff3811fa7c09871d7f5068c23a3ef07bbaf94eb6d2962e4ef64d72a364e9a"}
Apr 20 13:30:46.996453 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.996409 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4qdsh" event={"ID":"41710337-4f82-4bb8-abe7-f7a5cc3d9802","Type":"ContainerStarted","Data":"11cbb9c6c9dc20baeabdc491123de0ab87494250a87b73b7b6c5ffe76e662f01"}
Apr 20 13:30:46.998512 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:46.998470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tm5fj" event={"ID":"a7673b95-6e38-4e6e-84a1-c083cd4e6356","Type":"ContainerStarted","Data":"4d80e48be2b6eb205c04d91d77d21b450282d35f1f0bcc540bbc8b5bbfd06541"}
Apr 20 13:30:47.000113 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:47.000077 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" event={"ID":"75eb0979-9a66-40d2-a063-6f592c87a4f1","Type":"ContainerStarted","Data":"247b0adf65ac14e7fe20f11ca84a1642b918e22c90a201765babf3f9612a6d82"}
Apr 20 13:30:47.006756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:47.006378 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-144.ec2.internal" podStartSLOduration=2.006363764 podStartE2EDuration="2.006363764s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:30:47.006245196 +0000 UTC m=+3.601936273" watchObservedRunningTime="2026-04-20 13:30:47.006363764 +0000 UTC m=+3.602054838"
Apr 20 13:30:47.513456 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:47.513421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:47.513633 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.513558 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:30:47.513633 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.513618 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs podName:0de99a89-e8e5-491a-90c3-5c371ed6705f nodeName:}" failed. No retries permitted until 2026-04-20 13:30:49.513599448 +0000 UTC m=+6.109290518 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs") pod "network-metrics-daemon-5w9cl" (UID: "0de99a89-e8e5-491a-90c3-5c371ed6705f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:30:47.613837 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:47.613797 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:47.614004 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:47.613877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:47.614069 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.614003 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 13:30:47.614069 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.614025 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 13:30:47.614069 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.614042 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 13:30:47.614069 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.614055 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z7gpw for pod openshift-network-diagnostics/network-check-target-sb687: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:47.614294 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.614073 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret podName:3b76b883-7171-4efb-b2ac-fc558e9fdf79 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:49.614052821 +0000 UTC m=+6.209743890 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret") pod "global-pull-secret-syncer-5w2mr" (UID: "3b76b883-7171-4efb-b2ac-fc558e9fdf79") : object "kube-system"/"original-pull-secret" not registered
Apr 20 13:30:47.614294 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.614113 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw podName:ff4d6258-b35d-4f25-b171-7ce6265db518 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:49.614101612 +0000 UTC m=+6.209792669 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z7gpw" (UniqueName: "kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw") pod "network-check-target-sb687" (UID: "ff4d6258-b35d-4f25-b171-7ce6265db518") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:47.977383 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:47.977304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:47.977804 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.977443 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f"
Apr 20 13:30:47.977905 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:47.977885 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:47.978005 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.977984 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518"
Apr 20 13:30:47.978087 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:47.978073 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:47.978201 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:47.978181 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79"
Apr 20 13:30:48.028257 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:48.028218 2573 generic.go:358] "Generic (PLEG): container finished" podID="37832f134222998e163934b3a7a5f97c" containerID="82e3eb91e5a7648f0f7fc84b47b5397d5c750c602678b78199807ed7220cfcab" exitCode=0
Apr 20 13:30:48.028550 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:48.028489 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal" event={"ID":"37832f134222998e163934b3a7a5f97c","Type":"ContainerDied","Data":"82e3eb91e5a7648f0f7fc84b47b5397d5c750c602678b78199807ed7220cfcab"}
Apr 20 13:30:49.038903 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:49.038292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal" event={"ID":"37832f134222998e163934b3a7a5f97c","Type":"ContainerStarted","Data":"1f81e5aa8102c7fe43d52be5668f99452979b5cbce18f590c456a5edcc0758eb"}
Apr 20 13:30:49.529955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:49.529920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:49.530166 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.530111 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:30:49.530233 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.530185 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs podName:0de99a89-e8e5-491a-90c3-5c371ed6705f nodeName:}" failed. No retries permitted until 2026-04-20 13:30:53.530167511 +0000 UTC m=+10.125858580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs") pod "network-metrics-daemon-5w9cl" (UID: "0de99a89-e8e5-491a-90c3-5c371ed6705f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:30:49.630450 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:49.630414 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:49.630618 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:49.630487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:49.630682 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.630615 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 13:30:49.630682 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.630636 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 13:30:49.630682 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.630648 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z7gpw for pod openshift-network-diagnostics/network-check-target-sb687: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:49.630824 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.630700 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw podName:ff4d6258-b35d-4f25-b171-7ce6265db518 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:53.630682648 +0000 UTC m=+10.226373716 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-z7gpw" (UniqueName: "kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw") pod "network-check-target-sb687" (UID: "ff4d6258-b35d-4f25-b171-7ce6265db518") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:49.630887 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.630824 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 13:30:49.630933 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.630887 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret podName:3b76b883-7171-4efb-b2ac-fc558e9fdf79 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:53.630870924 +0000 UTC m=+10.226561993 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret") pod "global-pull-secret-syncer-5w2mr" (UID: "3b76b883-7171-4efb-b2ac-fc558e9fdf79") : object "kube-system"/"original-pull-secret" not registered
Apr 20 13:30:49.977953 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:49.977924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:49.978155 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.978048 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f"
Apr 20 13:30:49.978494 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:49.978458 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:49.978607 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.978584 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518"
Apr 20 13:30:49.978684 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:49.978665 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:49.978771 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:49.978753 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79"
Apr 20 13:30:51.976168 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:51.975961 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:51.976168 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:51.975961 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:51.976168 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:51.976095 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f"
Apr 20 13:30:51.976713 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:51.976190 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518"
Apr 20 13:30:51.976713 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:51.976239 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:51.976713 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:51.976307 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79"
Apr 20 13:30:53.563339 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:53.563293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:53.563784 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.563435 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:30:53.563784 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.563505 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs podName:0de99a89-e8e5-491a-90c3-5c371ed6705f nodeName:}" failed. No retries permitted until 2026-04-20 13:31:01.563484464 +0000 UTC m=+18.159175535 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs") pod "network-metrics-daemon-5w9cl" (UID: "0de99a89-e8e5-491a-90c3-5c371ed6705f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:30:53.664850 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:53.664187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:53.664850 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:53.664269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:53.664850 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.664402 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 13:30:53.664850 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.664422 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 13:30:53.664850 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.664434 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z7gpw for pod openshift-network-diagnostics/network-check-target-sb687: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:53.664850 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.664494 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw podName:ff4d6258-b35d-4f25-b171-7ce6265db518 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:01.664475881 +0000 UTC m=+18.260166952 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-z7gpw" (UniqueName: "kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw") pod "network-check-target-sb687" (UID: "ff4d6258-b35d-4f25-b171-7ce6265db518") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:53.664850 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.664577 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 13:30:53.664850 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.664627 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret podName:3b76b883-7171-4efb-b2ac-fc558e9fdf79 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:01.664614342 +0000 UTC m=+18.260305407 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret") pod "global-pull-secret-syncer-5w2mr" (UID: "3b76b883-7171-4efb-b2ac-fc558e9fdf79") : object "kube-system"/"original-pull-secret" not registered
Apr 20 13:30:53.976595 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:53.976509 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:53.976746 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:53.976623 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:53.976805 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.976625 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518"
Apr 20 13:30:53.976805 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:53.976747 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:53.976805 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.976779 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f"
Apr 20 13:30:53.976966 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:53.976853 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79"
Apr 20 13:30:55.975774 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:55.975737 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:55.976320 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:55.975791 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:55.976320 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:55.975875 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79"
Apr 20 13:30:55.976320 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:55.975939 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518"
Apr 20 13:30:55.976320 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:55.975959 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:55.976320 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:55.976066 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f"
Apr 20 13:30:57.976277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:57.976240 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:57.976797 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:57.976240 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:57.976797 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:57.976359 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f"
Apr 20 13:30:57.976797 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:57.976452 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518"
Apr 20 13:30:57.976797 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:57.976245 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:57.976797 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:57.976563 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79"
Apr 20 13:30:59.975490 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:59.975453 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:30:59.975889 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:59.975493 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:30:59.975889 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:30:59.975453 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:30:59.975889 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:59.975605 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:30:59.975889 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:59.975720 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:30:59.975889 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:30:59.975808 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:01.624470 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:01.624433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:01.624921 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.624600 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:01.624921 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.624666 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs podName:0de99a89-e8e5-491a-90c3-5c371ed6705f nodeName:}" failed. 
No retries permitted until 2026-04-20 13:31:17.624647589 +0000 UTC m=+34.220338664 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs") pod "network-metrics-daemon-5w9cl" (UID: "0de99a89-e8e5-491a-90c3-5c371ed6705f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:01.724772 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:01.724738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:01.724923 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:01.724806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:01.724923 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.724886 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 13:31:01.724923 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.724904 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:01.724923 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.724918 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Apr 20 13:31:01.725083 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.724927 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z7gpw for pod openshift-network-diagnostics/network-check-target-sb687: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:01.725083 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.724946 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret podName:3b76b883-7171-4efb-b2ac-fc558e9fdf79 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.7249303 +0000 UTC m=+34.320621373 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret") pod "global-pull-secret-syncer-5w2mr" (UID: "3b76b883-7171-4efb-b2ac-fc558e9fdf79") : object "kube-system"/"original-pull-secret" not registered Apr 20 13:31:01.725083 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.724969 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw podName:ff4d6258-b35d-4f25-b171-7ce6265db518 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.724958124 +0000 UTC m=+34.320649183 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z7gpw" (UniqueName: "kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw") pod "network-check-target-sb687" (UID: "ff4d6258-b35d-4f25-b171-7ce6265db518") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:01.976257 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:01.976169 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:01.976257 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:01.976187 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:01.976451 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:01.976169 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:01.976451 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.976288 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:01.976451 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.976398 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:31:01.976644 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:01.976472 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:31:03.976553 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:03.976381 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:03.977029 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:03.976457 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:03.977029 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:03.976615 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:03.977029 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:03.976715 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:31:03.977029 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:03.976473 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:03.977029 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:03.976783 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:31:04.065642 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.065605 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" event={"ID":"75eb0979-9a66-40d2-a063-6f592c87a4f1","Type":"ContainerStarted","Data":"ae5bc2e9a8fe47609015412ba0050bc41b8432f70701d0d0b7602a94fdb1bb3c"} Apr 20 13:31:04.066995 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.066964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerStarted","Data":"3e9b1acca32e0ba0f5ebea3ace29e21cc377f30173a4bfafbca406c90ba85727"} Apr 20 13:31:04.068458 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.068434 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" event={"ID":"c9955ecd-fee7-409f-b733-5e9973245030","Type":"ContainerStarted","Data":"b2cf3003eb1558f2e664fe29b8953d1394da0c608626e8fd551467317974d61b"} Apr 20 13:31:04.069777 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.069751 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-kmm7p" event={"ID":"5e473fb6-5d6c-47e5-9f17-d87b134e316e","Type":"ContainerStarted","Data":"bf76d4783345fb3f488c8defe9ea96edb3f56a586e6effe19fb93a293b8ef218"} Apr 20 13:31:04.070913 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.070892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2s69b" event={"ID":"6f769d40-1c0a-4957-8061-892b0f5e5266","Type":"ContainerStarted","Data":"03651ee52bceb3e42dbccc18a0fc90bafefaae1a563976f73ce4bbeb1eacb22b"} Apr 20 13:31:04.072740 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.072722 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:31:04.072987 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.072970 2573 generic.go:358] "Generic (PLEG): container finished" podID="932d1d43-95d3-476c-b3d2-da80b4fcf711" containerID="13d9bd2a9bc9dee30f273f3f5f3a734e568b94b842ef890ac21a808ca5f5fa48" exitCode=1 Apr 20 13:31:04.073043 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.073023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"ea8d201bd2cca46a6854231dd9ba6f62dde757bb210e85455a707ee52172174e"} Apr 20 13:31:04.073090 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.073044 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"726246e4b8c3c0fa20c3b6f3ee91753fa9e2afccf69566b3c4add8b0758c3033"} Apr 20 13:31:04.073090 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.073054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" 
event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerDied","Data":"13d9bd2a9bc9dee30f273f3f5f3a734e568b94b842ef890ac21a808ca5f5fa48"} Apr 20 13:31:04.073090 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.073064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"393c1db741d26894605bb45fb9bbb2d82f4b35bd26b740f24aabd168cd458052"} Apr 20 13:31:04.074053 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.074036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4qdsh" event={"ID":"41710337-4f82-4bb8-abe7-f7a5cc3d9802","Type":"ContainerStarted","Data":"e59f075a4c0f4b0932591e5ac53a7e95f9b697cc89558cd781616402cf64e8d4"} Apr 20 13:31:04.075255 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.075235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tm5fj" event={"ID":"a7673b95-6e38-4e6e-84a1-c083cd4e6356","Type":"ContainerStarted","Data":"acc66e098a58530540ea863004d07a6b42c946e0146346e962e1fc83a25030a8"} Apr 20 13:31:04.086843 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.086806 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-144.ec2.internal" podStartSLOduration=19.086793168 podStartE2EDuration="19.086793168s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:30:49.053529071 +0000 UTC m=+5.649220149" watchObservedRunningTime="2026-04-20 13:31:04.086793168 +0000 UTC m=+20.682484243" Apr 20 13:31:04.099789 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.099744 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gp6dp" 
podStartSLOduration=3.5592176650000003 podStartE2EDuration="20.099730881s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.616436567 +0000 UTC m=+3.212127621" lastFinishedPulling="2026-04-20 13:31:03.156949774 +0000 UTC m=+19.752640837" observedRunningTime="2026-04-20 13:31:04.087212679 +0000 UTC m=+20.682903756" watchObservedRunningTime="2026-04-20 13:31:04.099730881 +0000 UTC m=+20.695421959" Apr 20 13:31:04.116266 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.116227 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tm5fj" podStartSLOduration=3.318215777 podStartE2EDuration="20.116216499s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.604295431 +0000 UTC m=+3.199986486" lastFinishedPulling="2026-04-20 13:31:03.402296152 +0000 UTC m=+19.997987208" observedRunningTime="2026-04-20 13:31:04.116087834 +0000 UTC m=+20.711778924" watchObservedRunningTime="2026-04-20 13:31:04.116216499 +0000 UTC m=+20.711907574" Apr 20 13:31:04.116378 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.116336 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kmm7p" podStartSLOduration=3.323310814 podStartE2EDuration="20.116332568s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.609273197 +0000 UTC m=+3.204964264" lastFinishedPulling="2026-04-20 13:31:03.402294949 +0000 UTC m=+19.997986018" observedRunningTime="2026-04-20 13:31:04.099259094 +0000 UTC m=+20.694950182" watchObservedRunningTime="2026-04-20 13:31:04.116332568 +0000 UTC m=+20.712023643" Apr 20 13:31:04.136030 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.135983 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4qdsh" podStartSLOduration=3.314656188 podStartE2EDuration="20.135969954s" 
podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.616964342 +0000 UTC m=+3.212655402" lastFinishedPulling="2026-04-20 13:31:03.4382781 +0000 UTC m=+20.033969168" observedRunningTime="2026-04-20 13:31:04.135478841 +0000 UTC m=+20.731169919" watchObservedRunningTime="2026-04-20 13:31:04.135969954 +0000 UTC m=+20.731661032" Apr 20 13:31:04.179413 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:04.179358 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2s69b" podStartSLOduration=3.325729333 podStartE2EDuration="20.179339467s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.609720611 +0000 UTC m=+3.205411680" lastFinishedPulling="2026-04-20 13:31:03.463330755 +0000 UTC m=+20.059021814" observedRunningTime="2026-04-20 13:31:04.178596383 +0000 UTC m=+20.774287473" watchObservedRunningTime="2026-04-20 13:31:04.179339467 +0000 UTC m=+20.775030544" Apr 20 13:31:05.079900 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.079678 2573 generic.go:358] "Generic (PLEG): container finished" podID="29c60f5b-f12d-43ec-a794-f2abbe748308" containerID="3e9b1acca32e0ba0f5ebea3ace29e21cc377f30173a4bfafbca406c90ba85727" exitCode=0 Apr 20 13:31:05.080562 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.079749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerDied","Data":"3e9b1acca32e0ba0f5ebea3ace29e21cc377f30173a4bfafbca406c90ba85727"} Apr 20 13:31:05.081381 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.081357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sgbrl" event={"ID":"83d0b332-6e36-4e2e-8231-501955bcf71b","Type":"ContainerStarted","Data":"8243d82acb5e055b93f5dcb70f511f971c0a1e9e8280322ff447a0bb32f15f0b"} Apr 20 13:31:05.084692 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:31:05.084673 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:31:05.085075 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.085043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"91e1d5ae2f4756bea1684c58403ab760c997317a33b6c627100e88386a00ccb1"} Apr 20 13:31:05.085204 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.085187 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"cf99f135e0a9ae6f9f26abb43de177cbb8e7b2042a26f73eaf91b68b2559c8ec"} Apr 20 13:31:05.112399 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.112348 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sgbrl" podStartSLOduration=4.281788749 podStartE2EDuration="21.112329939s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.609391081 +0000 UTC m=+3.205082142" lastFinishedPulling="2026-04-20 13:31:03.439932275 +0000 UTC m=+20.035623332" observedRunningTime="2026-04-20 13:31:05.111741039 +0000 UTC m=+21.707432114" watchObservedRunningTime="2026-04-20 13:31:05.112329939 +0000 UTC m=+21.708021016" Apr 20 13:31:05.226732 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.226702 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 13:31:05.946436 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.946247 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T13:31:05.226725449Z","UUID":"17f5d21a-f040-407d-bc40-378471b2d0ff","Handler":null,"Name":"","Endpoint":""} Apr 20 13:31:05.948860 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.948837 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 13:31:05.948986 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.948868 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 13:31:05.975289 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.975258 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:05.975424 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.975258 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:05.975581 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:05.975258 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:05.975581 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:05.975492 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:31:05.975706 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:05.975375 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:31:05.978613 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:05.976410 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:06.077966 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:06.077927 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:31:06.078639 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:06.078614 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:31:06.088897 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:06.088868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" event={"ID":"c9955ecd-fee7-409f-b733-5e9973245030","Type":"ContainerStarted","Data":"040c3086c8f5ff6fabc1ca8768f7159c2b35be45f6042817ef36e7fff9dc728f"} Apr 20 13:31:06.089310 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:06.089247 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:31:06.089577 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:06.089552 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tm5fj" Apr 20 13:31:07.093396 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:07.093133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" event={"ID":"c9955ecd-fee7-409f-b733-5e9973245030","Type":"ContainerStarted","Data":"a892831f93224c4ced955de07da38f5832346cfd7716591648f6ea0bb0b12cfc"} Apr 20 13:31:07.096813 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:07.096788 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:31:07.097214 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:07.097172 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"cefce909ae05d1fb4d0dbf62ca9e6a3c0f9c369a4893fdb85ae30dbdd872bff2"} Apr 20 13:31:07.108115 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:07.108065 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lbp89" podStartSLOduration=3.222057093 podStartE2EDuration="23.108051497s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.613427975 +0000 UTC m=+3.209119045" lastFinishedPulling="2026-04-20 13:31:06.49942238 +0000 UTC m=+23.095113449" observedRunningTime="2026-04-20 13:31:07.107542261 +0000 UTC m=+23.703233354" watchObservedRunningTime="2026-04-20 13:31:07.108051497 +0000 UTC m=+23.703742564" Apr 20 13:31:07.975913 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:07.975870 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:07.976094 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:07.975980 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:31:07.976094 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:07.975998 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:07.976094 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:07.976031 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:07.976280 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:07.976115 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:07.976280 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:07.976229 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:31:09.975714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:09.975533 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:09.976402 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:09.975531 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:09.976402 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:09.975788 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:31:09.976402 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:09.975864 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:31:09.976402 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:09.975531 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:09.976402 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:09.975945 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:10.104706 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:10.104680 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:31:10.104985 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:10.104960 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"da757d97041ff86c264fdfe450f1e12e4a97538b33feb25fd5248d3bf5ef579a"} Apr 20 13:31:10.105264 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:10.105241 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:31:10.105570 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:10.105552 2573 scope.go:117] "RemoveContainer" containerID="13d9bd2a9bc9dee30f273f3f5f3a734e568b94b842ef890ac21a808ca5f5fa48" Apr 20 13:31:10.106709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:10.106689 2573 generic.go:358] "Generic (PLEG): container finished" podID="29c60f5b-f12d-43ec-a794-f2abbe748308" containerID="20227ba8b394de982f9a3d8afe291a3bc58ae4e47b1be6bf96b347f0ba89009e" exitCode=0 Apr 20 13:31:10.106772 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:10.106728 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerDied","Data":"20227ba8b394de982f9a3d8afe291a3bc58ae4e47b1be6bf96b347f0ba89009e"} Apr 20 13:31:10.120494 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:10.120475 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:31:11.110714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.110680 2573 generic.go:358] "Generic (PLEG): container finished" podID="29c60f5b-f12d-43ec-a794-f2abbe748308" containerID="7bbf0bfd274a28795fe984987f192d98a55b16d293c59544852d9934c7c06568" exitCode=0 Apr 20 13:31:11.111221 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.110725 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerDied","Data":"7bbf0bfd274a28795fe984987f192d98a55b16d293c59544852d9934c7c06568"} Apr 20 13:31:11.115998 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.115980 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:31:11.116325 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.116304 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" event={"ID":"932d1d43-95d3-476c-b3d2-da80b4fcf711","Type":"ContainerStarted","Data":"157faecbff3969ad751e78991677a45ce6ceda67c557c380b25059609979e632"} Apr 20 13:31:11.116709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.116690 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:31:11.116815 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.116714 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:31:11.131092 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.131069 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" Apr 20 13:31:11.157245 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.157202 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-drksq" podStartSLOduration=10.288920286 podStartE2EDuration="27.157186574s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.618401082 +0000 UTC m=+3.214092135" lastFinishedPulling="2026-04-20 13:31:03.486667368 +0000 UTC m=+20.082358423" observedRunningTime="2026-04-20 13:31:11.15606914 +0000 UTC m=+27.751760215" watchObservedRunningTime="2026-04-20 13:31:11.157186574 +0000 UTC m=+27.752877650" Apr 20 13:31:11.174329 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.174301 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sb687"] Apr 20 13:31:11.174481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.174410 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:11.174541 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:11.174509 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:31:11.177116 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.177089 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5w9cl"] Apr 20 13:31:11.177267 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.177219 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:11.177347 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:11.177329 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:31:11.177790 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.177769 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5w2mr"] Apr 20 13:31:11.177886 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:11.177874 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:11.177989 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:11.177971 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:12.120048 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:12.119738 2573 generic.go:358] "Generic (PLEG): container finished" podID="29c60f5b-f12d-43ec-a794-f2abbe748308" containerID="65504d5d567cdc44c3ac5a6b8c80aa5a3510767eb1718b6755757e78f08d513f" exitCode=0 Apr 20 13:31:12.120456 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:12.119819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerDied","Data":"65504d5d567cdc44c3ac5a6b8c80aa5a3510767eb1718b6755757e78f08d513f"} Apr 20 13:31:12.975489 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:12.975461 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:12.975637 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:12.975465 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:12.975637 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:12.975581 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:12.975757 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:12.975678 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:31:12.975757 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:12.975465 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:12.975856 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:12.975771 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:31:14.976116 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:14.976085 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:14.976883 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:14.976085 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:14.976883 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:14.976201 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:14.976883 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:14.976206 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sb687" podUID="ff4d6258-b35d-4f25-b171-7ce6265db518" Apr 20 13:31:14.976883 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:14.976281 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5w2mr" podUID="3b76b883-7171-4efb-b2ac-fc558e9fdf79" Apr 20 13:31:14.976883 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:14.976360 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5w9cl" podUID="0de99a89-e8e5-491a-90c3-5c371ed6705f" Apr 20 13:31:16.758303 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.758225 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-144.ec2.internal" event="NodeReady" Apr 20 13:31:16.758791 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.758352 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 13:31:16.794495 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.794463 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7889fdc99c-s6dfk"] Apr 20 13:31:16.796834 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.796810 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.799230 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.799094 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 13:31:16.799402 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.799376 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xw967\"" Apr 20 13:31:16.799633 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.799597 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 13:31:16.799739 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.799643 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 13:31:16.804465 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.804441 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 13:31:16.806558 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.806538 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p7fbq"] Apr 20 13:31:16.809379 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.809357 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:16.809563 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.809522 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7889fdc99c-s6dfk"] Apr 20 13:31:16.811568 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.811548 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hdgh8"] Apr 20 13:31:16.811692 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.811674 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 13:31:16.811769 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.811575 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c8wr5\"" Apr 20 13:31:16.811769 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.811628 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 13:31:16.813575 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.813555 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hdgh8" Apr 20 13:31:16.815781 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.815762 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mqqqc\"" Apr 20 13:31:16.816022 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.815965 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 13:31:16.816213 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.816194 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 13:31:16.816310 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.816235 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 13:31:16.819774 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.819754 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p7fbq"] Apr 20 13:31:16.824607 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.824587 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hdgh8"] Apr 20 13:31:16.946151 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-trusted-ca\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.946326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-image-registry-private-configuration\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.946326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-ca-trust-extracted\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.946326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbtj\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-kube-api-access-gsbtj\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.946326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946251 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-bound-sa-token\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.946501 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s2qk\" (UniqueName: \"kubernetes.io/projected/8d3811b3-7e75-4345-b591-277c5aecb5fd-kube-api-access-9s2qk\") pod \"dns-default-p7fbq\" (UID: 
\"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:16.946501 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-certificates\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.946501 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946445 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8" Apr 20 13:31:16.946501 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d3811b3-7e75-4345-b591-277c5aecb5fd-tmp-dir\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:16.946650 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946526 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d3811b3-7e75-4345-b591-277c5aecb5fd-config-volume\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:16.946650 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4whs\" (UniqueName: 
\"kubernetes.io/projected/8e1662ff-63f6-4f08-9e96-75f038878584-kube-api-access-w4whs\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8" Apr 20 13:31:16.946650 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-installation-pull-secrets\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.946650 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:16.946809 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.946653 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:16.975666 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.975628 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:16.975834 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.975628 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr" Apr 20 13:31:16.975834 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.975628 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl" Apr 20 13:31:16.978536 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.978504 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 13:31:16.978674 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.978576 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wf844\"" Apr 20 13:31:16.978674 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.978582 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 13:31:16.978674 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.978635 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 13:31:16.978865 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.978841 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 13:31:16.978979 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:16.978885 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gqjcl\"" Apr 20 13:31:17.047331 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8" Apr 20 13:31:17.047492 
ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d3811b3-7e75-4345-b591-277c5aecb5fd-tmp-dir\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:17.047492 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d3811b3-7e75-4345-b591-277c5aecb5fd-config-volume\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:17.047492 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4whs\" (UniqueName: \"kubernetes.io/projected/8e1662ff-63f6-4f08-9e96-75f038878584-kube-api-access-w4whs\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8" Apr 20 13:31:17.047492 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.047443 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 13:31:17.047697 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.047508 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert podName:8e1662ff-63f6-4f08-9e96-75f038878584 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.547490225 +0000 UTC m=+34.143181294 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert") pod "ingress-canary-hdgh8" (UID: "8e1662ff-63f6-4f08-9e96-75f038878584") : secret "canary-serving-cert" not found Apr 20 13:31:17.047697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-installation-pull-secrets\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:17.047697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:17.047697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:17.047697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-trusted-ca\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:17.047697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047655 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-image-registry-private-configuration\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:17.047697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047682 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-ca-trust-extracted\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbtj\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-kube-api-access-gsbtj\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.047727 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.047741 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7889fdc99c-s6dfk: secret "image-registry-tls" not found Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047750 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d3811b3-7e75-4345-b591-277c5aecb5fd-tmp-dir\") pod 
\"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.047782 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls podName:1314261e-ac1b-4ca4-8c20-11cf3ad0f281 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.547766444 +0000 UTC m=+34.143457513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls") pod "image-registry-7889fdc99c-s6dfk" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281") : secret "image-registry-tls" not found
Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.047842 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.047877 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls podName:8d3811b3-7e75-4345-b591-277c5aecb5fd nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.547863795 +0000 UTC m=+34.143554856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls") pod "dns-default-p7fbq" (UID: "8d3811b3-7e75-4345-b591-277c5aecb5fd") : secret "dns-default-metrics-tls" not found
Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-bound-sa-token\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.047931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s2qk\" (UniqueName: \"kubernetes.io/projected/8d3811b3-7e75-4345-b591-277c5aecb5fd-kube-api-access-9s2qk\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:17.048044 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.048017 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d3811b3-7e75-4345-b591-277c5aecb5fd-config-volume\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:17.048527 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.048079 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-certificates\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.048527 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.048222 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-ca-trust-extracted\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.048734 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.048707 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-certificates\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.048854 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.048790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-trusted-ca\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.052072 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.052027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-installation-pull-secrets\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.052072 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.052037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-image-registry-private-configuration\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.078585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.078556 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-bound-sa-token\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.078725 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.078653 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s2qk\" (UniqueName: \"kubernetes.io/projected/8d3811b3-7e75-4345-b591-277c5aecb5fd-kube-api-access-9s2qk\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:17.078725 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.078676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4whs\" (UniqueName: \"kubernetes.io/projected/8e1662ff-63f6-4f08-9e96-75f038878584-kube-api-access-w4whs\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8"
Apr 20 13:31:17.079429 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.079411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbtj\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-kube-api-access-gsbtj\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.552160 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.552120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8"
Apr 20 13:31:17.552331 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.552194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:17.552331 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.552222 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:17.552331 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.552267 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 13:31:17.552331 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.552321 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert podName:8e1662ff-63f6-4f08-9e96-75f038878584 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:18.552306201 +0000 UTC m=+35.147997260 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert") pod "ingress-canary-hdgh8" (UID: "8e1662ff-63f6-4f08-9e96-75f038878584") : secret "canary-serving-cert" not found
Apr 20 13:31:17.552537 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.552331 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 13:31:17.552537 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.552384 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 13:31:17.552537 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.552398 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls podName:8d3811b3-7e75-4345-b591-277c5aecb5fd nodeName:}" failed. No retries permitted until 2026-04-20 13:31:18.552384555 +0000 UTC m=+35.148075638 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls") pod "dns-default-p7fbq" (UID: "8d3811b3-7e75-4345-b591-277c5aecb5fd") : secret "dns-default-metrics-tls" not found
Apr 20 13:31:17.552537 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.552399 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7889fdc99c-s6dfk: secret "image-registry-tls" not found
Apr 20 13:31:17.552537 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.552442 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls podName:1314261e-ac1b-4ca4-8c20-11cf3ad0f281 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:18.552428704 +0000 UTC m=+35.148119772 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls") pod "image-registry-7889fdc99c-s6dfk" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281") : secret "image-registry-tls" not found
Apr 20 13:31:17.652985 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.652955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:31:17.653119 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.653059 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 13:31:17.653119 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:17.653104 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs podName:0de99a89-e8e5-491a-90c3-5c371ed6705f nodeName:}" failed. No retries permitted until 2026-04-20 13:31:49.653092246 +0000 UTC m=+66.248783314 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs") pod "network-metrics-daemon-5w9cl" (UID: "0de99a89-e8e5-491a-90c3-5c371ed6705f") : secret "metrics-daemon-secret" not found
Apr 20 13:31:17.753896 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.753869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:31:17.754037 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.753939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:31:17.756188 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.756157 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3b76b883-7171-4efb-b2ac-fc558e9fdf79-original-pull-secret\") pod \"global-pull-secret-syncer-5w2mr\" (UID: \"3b76b883-7171-4efb-b2ac-fc558e9fdf79\") " pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:31:17.756300 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.756215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7gpw\" (UniqueName: \"kubernetes.io/projected/ff4d6258-b35d-4f25-b171-7ce6265db518-kube-api-access-z7gpw\") pod \"network-check-target-sb687\" (UID: \"ff4d6258-b35d-4f25-b171-7ce6265db518\") " pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 
13:31:17.887911 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.887883 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:31:17.895587 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:17.895563 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5w2mr"
Apr 20 13:31:18.118774 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:18.118599 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5w2mr"]
Apr 20 13:31:18.126053 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:18.126023 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sb687"]
Apr 20 13:31:18.133884 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:18.133858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerStarted","Data":"7504fd23cbba1a4f5e0b8f76e605f692688e02409a78a5a63ba32eca9cd6e200"}
Apr 20 13:31:18.167932 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:18.167906 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4d6258_b35d_4f25_b171_7ce6265db518.slice/crio-04a51b02e73a507d3d94b22cc3ea544417f12b7ece3c8bc4255fb05403327eba WatchSource:0}: Error finding container 04a51b02e73a507d3d94b22cc3ea544417f12b7ece3c8bc4255fb05403327eba: Status 404 returned error can't find the container with id 04a51b02e73a507d3d94b22cc3ea544417f12b7ece3c8bc4255fb05403327eba
Apr 20 13:31:18.168131 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:18.168110 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b76b883_7171_4efb_b2ac_fc558e9fdf79.slice/crio-209f667b49647a4f731fd6731ded38b890cfc197f7e3d38c35f9e9963b2ca12f WatchSource:0}: Error finding container 209f667b49647a4f731fd6731ded38b890cfc197f7e3d38c35f9e9963b2ca12f: Status 404 returned error can't find the container with id 209f667b49647a4f731fd6731ded38b890cfc197f7e3d38c35f9e9963b2ca12f
Apr 20 13:31:18.560525 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:18.560489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:18.560664 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:18.560533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:18.560664 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:18.560583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8"
Apr 20 13:31:18.560664 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:18.560629 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 13:31:18.560664 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:18.560663 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found
Apr 20 13:31:18.560790 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:18.560691 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 13:31:18.560790 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:18.560702 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert podName:8e1662ff-63f6-4f08-9e96-75f038878584 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:20.560689148 +0000 UTC m=+37.156380202 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert") pod "ingress-canary-hdgh8" (UID: "8e1662ff-63f6-4f08-9e96-75f038878584") : secret "canary-serving-cert" not found
Apr 20 13:31:18.560790 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:18.560706 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7889fdc99c-s6dfk: secret "image-registry-tls" not found
Apr 20 13:31:18.560790 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:18.560714 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls podName:8d3811b3-7e75-4345-b591-277c5aecb5fd nodeName:}" failed. No retries permitted until 2026-04-20 13:31:20.560708163 +0000 UTC m=+37.156399216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls") pod "dns-default-p7fbq" (UID: "8d3811b3-7e75-4345-b591-277c5aecb5fd") : secret "dns-default-metrics-tls" not found
Apr 20 13:31:18.560790 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:18.560742 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls podName:1314261e-ac1b-4ca4-8c20-11cf3ad0f281 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:20.560730699 +0000 UTC m=+37.156421778 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls") pod "image-registry-7889fdc99c-s6dfk" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281") : secret "image-registry-tls" not found
Apr 20 13:31:19.137504 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:19.137329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sb687" event={"ID":"ff4d6258-b35d-4f25-b171-7ce6265db518","Type":"ContainerStarted","Data":"04a51b02e73a507d3d94b22cc3ea544417f12b7ece3c8bc4255fb05403327eba"}
Apr 20 13:31:19.140259 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:19.140225 2573 generic.go:358] "Generic (PLEG): container finished" podID="29c60f5b-f12d-43ec-a794-f2abbe748308" containerID="7504fd23cbba1a4f5e0b8f76e605f692688e02409a78a5a63ba32eca9cd6e200" exitCode=0
Apr 20 13:31:19.140400 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:19.140309 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerDied","Data":"7504fd23cbba1a4f5e0b8f76e605f692688e02409a78a5a63ba32eca9cd6e200"}
Apr 20 13:31:19.141613 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:19.141590 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5w2mr" event={"ID":"3b76b883-7171-4efb-b2ac-fc558e9fdf79","Type":"ContainerStarted","Data":"209f667b49647a4f731fd6731ded38b890cfc197f7e3d38c35f9e9963b2ca12f"}
Apr 20 13:31:20.147919 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:20.147645 2573 generic.go:358] "Generic (PLEG): container finished" podID="29c60f5b-f12d-43ec-a794-f2abbe748308" containerID="7cdedd6145dc991b14e59c4bfe7383111cd0873de537559ab476cdcf7489ab49" exitCode=0
Apr 20 13:31:20.147919 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:20.147742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerDied","Data":"7cdedd6145dc991b14e59c4bfe7383111cd0873de537559ab476cdcf7489ab49"}
Apr 20 13:31:20.578230 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:20.578184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8"
Apr 20 13:31:20.578419 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:20.578252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:20.578419 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:20.578283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:20.578419 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:20.578407 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 13:31:20.578558 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:20.578422 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7889fdc99c-s6dfk: secret "image-registry-tls" not found
Apr 20 13:31:20.578558 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:20.578442 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 13:31:20.578558 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:20.578473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls podName:1314261e-ac1b-4ca4-8c20-11cf3ad0f281 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:24.578458074 +0000 UTC m=+41.174149128 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls") pod "image-registry-7889fdc99c-s6dfk" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281") : secret "image-registry-tls" not found
Apr 20 13:31:20.578558 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:20.578405 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 13:31:20.578558 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:20.578512 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls podName:8d3811b3-7e75-4345-b591-277c5aecb5fd nodeName:}" failed. 
No retries permitted until 2026-04-20 13:31:24.578491555 +0000 UTC m=+41.174182615 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls") pod "dns-default-p7fbq" (UID: "8d3811b3-7e75-4345-b591-277c5aecb5fd") : secret "dns-default-metrics-tls" not found
Apr 20 13:31:20.578558 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:20.578530 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert podName:8e1662ff-63f6-4f08-9e96-75f038878584 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:24.578521078 +0000 UTC m=+41.174212138 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert") pod "ingress-canary-hdgh8" (UID: "8e1662ff-63f6-4f08-9e96-75f038878584") : secret "canary-serving-cert" not found
Apr 20 13:31:22.626644 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.626611 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rtbvg"]
Apr 20 13:31:22.651136 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.651109 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj"]
Apr 20 13:31:22.651271 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.651258 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg"
Apr 20 13:31:22.653640 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.653620 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 13:31:22.653952 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.653935 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 13:31:22.654318 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.654298 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 13:31:22.654504 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.654489 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 13:31:22.659042 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.659025 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kzp5b\""
Apr 20 13:31:22.664883 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.664866 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 13:31:22.669873 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.669855 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-67666b9c78-x7c4n"]
Apr 20 13:31:22.669996 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.669979 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj"
Apr 20 13:31:22.673558 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.673536 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 20 13:31:22.673806 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.673790 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 20 13:31:22.673883 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.673824 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 20 13:31:22.673883 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.673835 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 20 13:31:22.674021 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.673989 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-wrgb5\""
Apr 20 13:31:22.693811 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.693787 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj"]
Apr 20 13:31:22.693892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.693817 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rtbvg"]
Apr 20 13:31:22.693892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.693830 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-67666b9c78-x7c4n"]
Apr 20 13:31:22.693958 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.693906 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:22.695839 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.695821 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689d9ed8-d3dd-4b84-a93f-cc84672538b6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj"
Apr 20 13:31:22.695920 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.695849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/409c02a3-0a51-4fe6-813b-cc03f7497104-serving-cert\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg"
Apr 20 13:31:22.695920 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.695889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-stats-auth\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:22.695920 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.695908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409c02a3-0a51-4fe6-813b-cc03f7497104-config\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.696027 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.695964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/409c02a3-0a51-4fe6-813b-cc03f7497104-trusted-ca\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.696108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.696033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52vpv\" (UniqueName: \"kubernetes.io/projected/689d9ed8-d3dd-4b84-a93f-cc84672538b6-kube-api-access-52vpv\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" Apr 20 13:31:22.696108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.696054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-default-certificate\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.696108 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.696069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hklpj\" (UniqueName: \"kubernetes.io/projected/a70ca6b1-f55d-4081-b09f-dd5454b489d3-kube-api-access-hklpj\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.696108 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:31:22.696096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.696410 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.696111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.696410 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.696194 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncrvg\" (UniqueName: \"kubernetes.io/projected/409c02a3-0a51-4fe6-813b-cc03f7497104-kube-api-access-ncrvg\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.696410 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.696244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689d9ed8-d3dd-4b84-a93f-cc84672538b6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" Apr 20 13:31:22.697088 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.697059 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 13:31:22.697194 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.697093 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 13:31:22.697269 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.697193 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 13:31:22.697425 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.697413 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vwqlb\"" Apr 20 13:31:22.697425 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.697419 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 13:31:22.697528 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.697417 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 13:31:22.697964 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.697950 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 13:31:22.719908 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.719886 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"] Apr 20 13:31:22.736971 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.736949 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:22.738252 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.738232 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"] Apr 20 13:31:22.739505 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.739359 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 13:31:22.739505 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.739383 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 13:31:22.739702 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.739684 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-f64q9\"" Apr 20 13:31:22.739762 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.739686 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 13:31:22.740759 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.740743 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 13:31:22.796782 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.796760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/409c02a3-0a51-4fe6-813b-cc03f7497104-serving-cert\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.796881 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.796785 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/409c02a3-0a51-4fe6-813b-cc03f7497104-config\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.796881 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.796815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.796881 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.796834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/27fd6d03-d487-4763-a29e-c24f39dbeb32-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:22.796881 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.796854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52vpv\" (UniqueName: \"kubernetes.io/projected/689d9ed8-d3dd-4b84-a93f-cc84672538b6-kube-api-access-52vpv\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" Apr 20 13:31:22.797037 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.796950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hklpj\" (UniqueName: \"kubernetes.io/projected/a70ca6b1-f55d-4081-b09f-dd5454b489d3-kube-api-access-hklpj\") pod \"router-default-67666b9c78-x7c4n\" 
(UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.797117 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:22.797101 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:23.297083471 +0000 UTC m=+39.892774555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : configmap references non-existent config key: service-ca.crt Apr 20 13:31:22.797215 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:22.797215 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-default-certificate\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.797314 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.797314 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncrvg\" (UniqueName: \"kubernetes.io/projected/409c02a3-0a51-4fe6-813b-cc03f7497104-kube-api-access-ncrvg\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.797314 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:22.797309 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 13:31:22.797314 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689d9ed8-d3dd-4b84-a93f-cc84672538b6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" Apr 20 13:31:22.797522 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689d9ed8-d3dd-4b84-a93f-cc84672538b6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" Apr 20 13:31:22.797522 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:22.797366 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:23.297349923 +0000 UTC m=+39.893040994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : secret "router-metrics-certs-default" not found Apr 20 13:31:22.797522 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-stats-auth\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:22.797522 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/409c02a3-0a51-4fe6-813b-cc03f7497104-trusted-ca\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.797799 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld8f7\" (UniqueName: \"kubernetes.io/projected/27fd6d03-d487-4763-a29e-c24f39dbeb32-kube-api-access-ld8f7\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:22.797799 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.797444 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409c02a3-0a51-4fe6-813b-cc03f7497104-config\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.798454 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.798224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/409c02a3-0a51-4fe6-813b-cc03f7497104-trusted-ca\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.801014 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.800989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689d9ed8-d3dd-4b84-a93f-cc84672538b6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" Apr 20 13:31:22.801118 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.801089 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/409c02a3-0a51-4fe6-813b-cc03f7497104-serving-cert\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:22.801201 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.801188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-stats-auth\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" 
Apr 20 13:31:22.801257 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.801205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-default-certificate\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:22.805801 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.805782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689d9ed8-d3dd-4b84-a93f-cc84672538b6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj"
Apr 20 13:31:22.805879 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.805811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hklpj\" (UniqueName: \"kubernetes.io/projected/a70ca6b1-f55d-4081-b09f-dd5454b489d3-kube-api-access-hklpj\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:22.806089 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.806067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52vpv\" (UniqueName: \"kubernetes.io/projected/689d9ed8-d3dd-4b84-a93f-cc84672538b6-kube-api-access-52vpv\") pod \"kube-storage-version-migrator-operator-6769c5d45-j5mhj\" (UID: \"689d9ed8-d3dd-4b84-a93f-cc84672538b6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj"
Apr 20 13:31:22.806172 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.806156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncrvg\" (UniqueName: \"kubernetes.io/projected/409c02a3-0a51-4fe6-813b-cc03f7497104-kube-api-access-ncrvg\") pod \"console-operator-9d4b6777b-rtbvg\" (UID: \"409c02a3-0a51-4fe6-813b-cc03f7497104\") " pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg"
Apr 20 13:31:22.898352 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.898292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/27fd6d03-d487-4763-a29e-c24f39dbeb32-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"
Apr 20 13:31:22.898352 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.898329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"
Apr 20 13:31:22.898491 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.898392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld8f7\" (UniqueName: \"kubernetes.io/projected/27fd6d03-d487-4763-a29e-c24f39dbeb32-kube-api-access-ld8f7\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"
Apr 20 13:31:22.898533 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:22.898511 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 13:31:22.898614 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:22.898601 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls podName:27fd6d03-d487-4763-a29e-c24f39dbeb32 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:23.398581485 +0000 UTC m=+39.994272538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vtms5" (UID: "27fd6d03-d487-4763-a29e-c24f39dbeb32") : secret "cluster-monitoring-operator-tls" not found
Apr 20 13:31:22.899031 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.899014 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/27fd6d03-d487-4763-a29e-c24f39dbeb32-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"
Apr 20 13:31:22.906896 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.906868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld8f7\" (UniqueName: \"kubernetes.io/projected/27fd6d03-d487-4763-a29e-c24f39dbeb32-kube-api-access-ld8f7\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"
Apr 20 13:31:22.961042 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.961010 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg"
Apr 20 13:31:22.980771 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:22.980755 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj"
Apr 20 13:31:23.167369 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:23.167344 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rtbvg"]
Apr 20 13:31:23.176470 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:23.176434 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod409c02a3_0a51_4fe6_813b_cc03f7497104.slice/crio-af2f8c0b5b67e3c0bedbf09a123f9258960ad9a0239dfe3c4c3ebcf64b57a67e WatchSource:0}: Error finding container af2f8c0b5b67e3c0bedbf09a123f9258960ad9a0239dfe3c4c3ebcf64b57a67e: Status 404 returned error can't find the container with id af2f8c0b5b67e3c0bedbf09a123f9258960ad9a0239dfe3c4c3ebcf64b57a67e
Apr 20 13:31:23.183568 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:23.183547 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj"]
Apr 20 13:31:23.187329 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:23.187309 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689d9ed8_d3dd_4b84_a93f_cc84672538b6.slice/crio-b3b9bcfea9c25d4545b8951e9154a422e1f460575727e94ad284395374e4a67e WatchSource:0}: Error finding container b3b9bcfea9c25d4545b8951e9154a422e1f460575727e94ad284395374e4a67e: Status 404 returned error can't find the container with id b3b9bcfea9c25d4545b8951e9154a422e1f460575727e94ad284395374e4a67e
Apr 20 13:31:23.302477 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:23.302446 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:23.302630 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:23.302489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:23.302630 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:23.302582 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 13:31:23.302709 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:23.302631 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:24.30261036 +0000 UTC m=+40.898301417 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : configmap references non-existent config key: service-ca.crt
Apr 20 13:31:23.302709 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:23.302662 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:24.30265165 +0000 UTC m=+40.898342709 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : secret "router-metrics-certs-default" not found
Apr 20 13:31:23.403602 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:23.403567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"
Apr 20 13:31:23.403754 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:23.403716 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 13:31:23.403830 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:23.403792 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls podName:27fd6d03-d487-4763-a29e-c24f39dbeb32 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:24.403770118 +0000 UTC m=+40.999461172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vtms5" (UID: "27fd6d03-d487-4763-a29e-c24f39dbeb32") : secret "cluster-monitoring-operator-tls" not found
Apr 20 13:31:24.161120 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.161081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g88jt" event={"ID":"29c60f5b-f12d-43ec-a794-f2abbe748308","Type":"ContainerStarted","Data":"6870eeee9457d8da890bc3955a148c10d03d6f6b8c16ae7eb3a5cbb254577281"}
Apr 20 13:31:24.162977 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.162945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5w2mr" event={"ID":"3b76b883-7171-4efb-b2ac-fc558e9fdf79","Type":"ContainerStarted","Data":"db3f8dc135de06339a89bd1c8b2894814a8352eca1525ce0791d65c3360e7e0d"}
Apr 20 13:31:24.164320 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.164296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" event={"ID":"689d9ed8-d3dd-4b84-a93f-cc84672538b6","Type":"ContainerStarted","Data":"b3b9bcfea9c25d4545b8951e9154a422e1f460575727e94ad284395374e4a67e"}
Apr 20 13:31:24.165713 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.165684 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sb687" event={"ID":"ff4d6258-b35d-4f25-b171-7ce6265db518","Type":"ContainerStarted","Data":"8ef7e0ccceb1b15a19b1b71c1dbe21410a2e513ebb5b7069908e2ebf4ff099b5"}
Apr 20 13:31:24.165951 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.165918 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sb687"
Apr 20 13:31:24.167014 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.166992 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" event={"ID":"409c02a3-0a51-4fe6-813b-cc03f7497104","Type":"ContainerStarted","Data":"af2f8c0b5b67e3c0bedbf09a123f9258960ad9a0239dfe3c4c3ebcf64b57a67e"}
Apr 20 13:31:24.184137 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.184068 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g88jt" podStartSLOduration=8.982932184 podStartE2EDuration="40.184052744s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.615295606 +0000 UTC m=+3.210986667" lastFinishedPulling="2026-04-20 13:31:17.816416171 +0000 UTC m=+34.412107227" observedRunningTime="2026-04-20 13:31:24.183232225 +0000 UTC m=+40.778923296" watchObservedRunningTime="2026-04-20 13:31:24.184052744 +0000 UTC m=+40.779743808"
Apr 20 13:31:24.198938 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.198891 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sb687" podStartSLOduration=35.364835522 podStartE2EDuration="40.198882603s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:31:18.169704174 +0000 UTC m=+34.765395242" lastFinishedPulling="2026-04-20 13:31:23.003751255 +0000 UTC m=+39.599442323" observedRunningTime="2026-04-20 13:31:24.197961291 +0000 UTC m=+40.793652368" watchObservedRunningTime="2026-04-20 13:31:24.198882603 +0000 UTC m=+40.794573680"
Apr 20 13:31:24.213969 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.213922 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5w2mr" podStartSLOduration=34.360818006 podStartE2EDuration="39.213908895s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:31:18.169657424 +0000 UTC m=+34.765348478" lastFinishedPulling="2026-04-20 13:31:23.022748295 +0000 UTC m=+39.618439367" observedRunningTime="2026-04-20 13:31:24.212938458 +0000 UTC m=+40.808629535" watchObservedRunningTime="2026-04-20 13:31:24.213908895 +0000 UTC m=+40.809599971"
Apr 20 13:31:24.312779 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.312712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:24.312953 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.312855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:24.312953 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.312899 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 13:31:24.313071 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.312981 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:26.31295993 +0000 UTC m=+42.908651004 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : secret "router-metrics-certs-default" not found
Apr 20 13:31:24.313071 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.313003 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:26.312992462 +0000 UTC m=+42.908683537 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : configmap references non-existent config key: service-ca.crt
Apr 20 13:31:24.414314 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.414219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"
Apr 20 13:31:24.414809 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.414777 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 13:31:24.414940 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.414868 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls podName:27fd6d03-d487-4763-a29e-c24f39dbeb32 nodeName:}" failed.
No retries permitted until 2026-04-20 13:31:26.414846896 +0000 UTC m=+43.010537969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vtms5" (UID: "27fd6d03-d487-4763-a29e-c24f39dbeb32") : secret "cluster-monitoring-operator-tls" not found Apr 20 13:31:24.615852 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.615818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8" Apr 20 13:31:24.615852 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.615855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:24.616070 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.615980 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 13:31:24.616070 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:24.615999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:24.616070 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.616055 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert podName:8e1662ff-63f6-4f08-9e96-75f038878584 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:32.616035978 +0000 UTC m=+49.211727053 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert") pod "ingress-canary-hdgh8" (UID: "8e1662ff-63f6-4f08-9e96-75f038878584") : secret "canary-serving-cert" not found Apr 20 13:31:24.616221 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.616097 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 13:31:24.616221 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.616110 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7889fdc99c-s6dfk: secret "image-registry-tls" not found Apr 20 13:31:24.616221 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.616097 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 13:31:24.616221 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.616168 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls podName:1314261e-ac1b-4ca4-8c20-11cf3ad0f281 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:32.616156392 +0000 UTC m=+49.211847450 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls") pod "image-registry-7889fdc99c-s6dfk" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281") : secret "image-registry-tls" not found Apr 20 13:31:24.616366 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:24.616246 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls podName:8d3811b3-7e75-4345-b591-277c5aecb5fd nodeName:}" failed. No retries permitted until 2026-04-20 13:31:32.616222595 +0000 UTC m=+49.211913662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls") pod "dns-default-p7fbq" (UID: "8d3811b3-7e75-4345-b591-277c5aecb5fd") : secret "dns-default-metrics-tls" not found Apr 20 13:31:26.330000 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:26.329968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:26.330398 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:26.330014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:26.330398 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:26.330152 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle 
podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:30.330122973 +0000 UTC m=+46.925814040 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : configmap references non-existent config key: service-ca.crt Apr 20 13:31:26.330398 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:26.330215 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 13:31:26.330398 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:26.330267 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:30.330255613 +0000 UTC m=+46.925946701 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : secret "router-metrics-certs-default" not found Apr 20 13:31:26.431169 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:26.431130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:26.431273 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:26.431243 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 13:31:26.431321 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:26.431303 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls podName:27fd6d03-d487-4763-a29e-c24f39dbeb32 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:30.431288899 +0000 UTC m=+47.026979952 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vtms5" (UID: "27fd6d03-d487-4763-a29e-c24f39dbeb32") : secret "cluster-monitoring-operator-tls" not found Apr 20 13:31:27.174196 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:27.174159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" event={"ID":"689d9ed8-d3dd-4b84-a93f-cc84672538b6","Type":"ContainerStarted","Data":"99f9925b12a209c564b097862958b555070ddf60a27afab91a693999b6305790"} Apr 20 13:31:27.175533 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:27.175516 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/0.log" Apr 20 13:31:27.175607 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:27.175549 2573 generic.go:358] "Generic (PLEG): container finished" podID="409c02a3-0a51-4fe6-813b-cc03f7497104" containerID="e900e9c4e776b0745ec05c45b4d5fa2571ce6d424d169e283ac0a3b45addd4ce" exitCode=255 Apr 20 13:31:27.175607 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:27.175573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" event={"ID":"409c02a3-0a51-4fe6-813b-cc03f7497104","Type":"ContainerDied","Data":"e900e9c4e776b0745ec05c45b4d5fa2571ce6d424d169e283ac0a3b45addd4ce"} Apr 20 13:31:27.175790 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:27.175776 2573 scope.go:117] "RemoveContainer" containerID="e900e9c4e776b0745ec05c45b4d5fa2571ce6d424d169e283ac0a3b45addd4ce" Apr 20 13:31:27.190334 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:27.190296 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" podStartSLOduration=2.036149108 podStartE2EDuration="5.190284406s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="2026-04-20 13:31:23.188940569 +0000 UTC m=+39.784631623" lastFinishedPulling="2026-04-20 13:31:26.343075852 +0000 UTC m=+42.938766921" observedRunningTime="2026-04-20 13:31:27.189509781 +0000 UTC m=+43.785200859" watchObservedRunningTime="2026-04-20 13:31:27.190284406 +0000 UTC m=+43.785975482" Apr 20 13:31:28.179949 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.179921 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/1.log" Apr 20 13:31:28.180402 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.180284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/0.log" Apr 20 13:31:28.180402 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.180314 2573 generic.go:358] "Generic (PLEG): container finished" podID="409c02a3-0a51-4fe6-813b-cc03f7497104" containerID="66c9c5e31ceaa01741886b7054e6feb95a316f84f11c8f810e63a48ca8866244" exitCode=255 Apr 20 13:31:28.180499 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.180410 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" event={"ID":"409c02a3-0a51-4fe6-813b-cc03f7497104","Type":"ContainerDied","Data":"66c9c5e31ceaa01741886b7054e6feb95a316f84f11c8f810e63a48ca8866244"} Apr 20 13:31:28.180499 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.180454 2573 scope.go:117] "RemoveContainer" containerID="e900e9c4e776b0745ec05c45b4d5fa2571ce6d424d169e283ac0a3b45addd4ce" Apr 20 13:31:28.180734 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.180719 2573 
scope.go:117] "RemoveContainer" containerID="66c9c5e31ceaa01741886b7054e6feb95a316f84f11c8f810e63a48ca8866244" Apr 20 13:31:28.180936 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:28.180919 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rtbvg_openshift-console-operator(409c02a3-0a51-4fe6-813b-cc03f7497104)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" podUID="409c02a3-0a51-4fe6-813b-cc03f7497104" Apr 20 13:31:28.608929 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.608892 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw"] Apr 20 13:31:28.613106 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.613084 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" Apr 20 13:31:28.615646 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.615621 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 13:31:28.615759 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.615635 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 13:31:28.616394 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.616378 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-xs7gp\"" Apr 20 13:31:28.620199 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.620173 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw"] Apr 20 
13:31:28.648828 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.648798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m755t\" (UniqueName: \"kubernetes.io/projected/c643cd33-a7a0-4649-8ea1-1c6cc7ad1130-kube-api-access-m755t\") pod \"migrator-74bb7799d9-kl8nw\" (UID: \"c643cd33-a7a0-4649-8ea1-1c6cc7ad1130\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" Apr 20 13:31:28.749357 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.749325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m755t\" (UniqueName: \"kubernetes.io/projected/c643cd33-a7a0-4649-8ea1-1c6cc7ad1130-kube-api-access-m755t\") pod \"migrator-74bb7799d9-kl8nw\" (UID: \"c643cd33-a7a0-4649-8ea1-1c6cc7ad1130\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" Apr 20 13:31:28.757792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.757769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m755t\" (UniqueName: \"kubernetes.io/projected/c643cd33-a7a0-4649-8ea1-1c6cc7ad1130-kube-api-access-m755t\") pod \"migrator-74bb7799d9-kl8nw\" (UID: \"c643cd33-a7a0-4649-8ea1-1c6cc7ad1130\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" Apr 20 13:31:28.922344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:28.922247 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" Apr 20 13:31:29.035596 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:29.035566 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw"] Apr 20 13:31:29.038840 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:29.038811 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc643cd33_a7a0_4649_8ea1_1c6cc7ad1130.slice/crio-5f15ad01731ba124363c207fe208d5b4227a5a377618d16baedf83a0a6f9f16d WatchSource:0}: Error finding container 5f15ad01731ba124363c207fe208d5b4227a5a377618d16baedf83a0a6f9f16d: Status 404 returned error can't find the container with id 5f15ad01731ba124363c207fe208d5b4227a5a377618d16baedf83a0a6f9f16d Apr 20 13:31:29.183685 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:29.183591 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" event={"ID":"c643cd33-a7a0-4649-8ea1-1c6cc7ad1130","Type":"ContainerStarted","Data":"5f15ad01731ba124363c207fe208d5b4227a5a377618d16baedf83a0a6f9f16d"} Apr 20 13:31:29.185026 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:29.185009 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/1.log" Apr 20 13:31:29.185396 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:29.185370 2573 scope.go:117] "RemoveContainer" containerID="66c9c5e31ceaa01741886b7054e6feb95a316f84f11c8f810e63a48ca8866244" Apr 20 13:31:29.185589 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:29.185570 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-rtbvg_openshift-console-operator(409c02a3-0a51-4fe6-813b-cc03f7497104)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" podUID="409c02a3-0a51-4fe6-813b-cc03f7497104" Apr 20 13:31:30.360923 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:30.360899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:30.361250 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:30.360944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:30.361250 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:30.361053 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:38.361035042 +0000 UTC m=+54.956726103 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : configmap references non-existent config key: service-ca.crt Apr 20 13:31:30.361250 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:30.361062 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 13:31:30.361250 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:30.361108 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:38.361094682 +0000 UTC m=+54.956785739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : secret "router-metrics-certs-default" not found Apr 20 13:31:30.461308 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:30.461278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:30.461442 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:30.461425 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 13:31:30.461492 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:30.461482 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls podName:27fd6d03-d487-4763-a29e-c24f39dbeb32 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:38.461468214 +0000 UTC m=+55.057159301 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vtms5" (UID: "27fd6d03-d487-4763-a29e-c24f39dbeb32") : secret "cluster-monitoring-operator-tls" not found Apr 20 13:31:31.191655 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:31.191622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" event={"ID":"c643cd33-a7a0-4649-8ea1-1c6cc7ad1130","Type":"ContainerStarted","Data":"a7e852cd88656fb0cf8d3102a6ff901220cbb9415b94ddd1dac2cfc124ea4d96"} Apr 20 13:31:31.191800 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:31.191663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" event={"ID":"c643cd33-a7a0-4649-8ea1-1c6cc7ad1130","Type":"ContainerStarted","Data":"01fb20be276ee48f3c2a924616c317339056432c410ce228aad3b569863cefdf"} Apr 20 13:31:31.208371 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:31.208325 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kl8nw" podStartSLOduration=1.9301482060000001 podStartE2EDuration="3.208312644s" podCreationTimestamp="2026-04-20 13:31:28 +0000 UTC" firstStartedPulling="2026-04-20 13:31:29.041316803 +0000 UTC m=+45.637007857" lastFinishedPulling="2026-04-20 13:31:30.31948124 +0000 UTC m=+46.915172295" observedRunningTime="2026-04-20 13:31:31.20699041 +0000 UTC m=+47.802681485" 
watchObservedRunningTime="2026-04-20 13:31:31.208312644 +0000 UTC m=+47.804003784" Apr 20 13:31:32.679161 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:32.679090 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:32.679239 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8" Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:32.679259 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:32.679281 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7889fdc99c-s6dfk: secret "image-registry-tls" not found Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:32.679340 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls podName:1314261e-ac1b-4ca4-8c20-11cf3ad0f281 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:48.67932082 +0000 UTC m=+65.275011891 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls") pod "image-registry-7889fdc99c-s6dfk" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281") : secret "image-registry-tls" not found
Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:32.679346 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:32.679365 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:32.679386 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls podName:8d3811b3-7e75-4345-b591-277c5aecb5fd nodeName:}" failed. No retries permitted until 2026-04-20 13:31:48.679375125 +0000 UTC m=+65.275066180 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls") pod "dns-default-p7fbq" (UID: "8d3811b3-7e75-4345-b591-277c5aecb5fd") : secret "dns-default-metrics-tls" not found
Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:32.679431 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert podName:8e1662ff-63f6-4f08-9e96-75f038878584 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:48.679419137 +0000 UTC m=+65.275110190 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert") pod "ingress-canary-hdgh8" (UID: "8e1662ff-63f6-4f08-9e96-75f038878584") : secret "canary-serving-cert" not found
Apr 20 13:31:32.679538 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:32.679267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:32.715401 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:32.715372 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4qdsh_41710337-4f82-4bb8-abe7-f7a5cc3d9802/dns-node-resolver/0.log"
Apr 20 13:31:32.961431 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:32.961337 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg"
Apr 20 13:31:32.961431 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:32.961385 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg"
Apr 20 13:31:32.961803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:32.961785 2573 scope.go:117] "RemoveContainer" containerID="66c9c5e31ceaa01741886b7054e6feb95a316f84f11c8f810e63a48ca8866244"
Apr 20 13:31:32.962026 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:32.962006 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rtbvg_openshift-console-operator(409c02a3-0a51-4fe6-813b-cc03f7497104)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" podUID="409c02a3-0a51-4fe6-813b-cc03f7497104"
Apr 20 13:31:33.058370 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.058336 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-h754j"]
Apr 20 13:31:33.080751 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.080719 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h754j"]
Apr 20 13:31:33.080887 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.080825 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.083340 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.083317 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 13:31:33.083476 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.083317 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vbdzl\""
Apr 20 13:31:33.083476 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.083320 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 13:31:33.083476 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.083409 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 13:31:33.083617 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.083320 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 13:31:33.182015 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.181982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/824d27e0-a488-4bf8-badb-5a72756a911c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.182211 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.182038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.182211 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.182113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/824d27e0-a488-4bf8-badb-5a72756a911c-crio-socket\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.182211 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.182179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/824d27e0-a488-4bf8-badb-5a72756a911c-data-volume\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.182211 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.182210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rmq6\" (UniqueName: \"kubernetes.io/projected/824d27e0-a488-4bf8-badb-5a72756a911c-kube-api-access-2rmq6\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.282878 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.282788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/824d27e0-a488-4bf8-badb-5a72756a911c-crio-socket\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.282878 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.282828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/824d27e0-a488-4bf8-badb-5a72756a911c-data-volume\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.282878 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.282846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rmq6\" (UniqueName: \"kubernetes.io/projected/824d27e0-a488-4bf8-badb-5a72756a911c-kube-api-access-2rmq6\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.283133 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.282964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/824d27e0-a488-4bf8-badb-5a72756a911c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.283133 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.283003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.283133 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.283037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/824d27e0-a488-4bf8-badb-5a72756a911c-crio-socket\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.283255 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:33.283166 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 13:31:33.283255 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.283189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/824d27e0-a488-4bf8-badb-5a72756a911c-data-volume\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.283319 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:33.283263 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls podName:824d27e0-a488-4bf8-badb-5a72756a911c nodeName:}" failed. No retries permitted until 2026-04-20 13:31:33.783242674 +0000 UTC m=+50.378933746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h754j" (UID: "824d27e0-a488-4bf8-badb-5a72756a911c") : secret "insights-runtime-extractor-tls" not found
Apr 20 13:31:33.283578 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.283562 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/824d27e0-a488-4bf8-badb-5a72756a911c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.292596 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.292573 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rmq6\" (UniqueName: \"kubernetes.io/projected/824d27e0-a488-4bf8-badb-5a72756a911c-kube-api-access-2rmq6\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.787209 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.787173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:33.787645 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:33.787301 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 13:31:33.787645 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:33.787367 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls podName:824d27e0-a488-4bf8-badb-5a72756a911c nodeName:}" failed. No retries permitted until 2026-04-20 13:31:34.78735092 +0000 UTC m=+51.383041979 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h754j" (UID: "824d27e0-a488-4bf8-badb-5a72756a911c") : secret "insights-runtime-extractor-tls" not found
Apr 20 13:31:33.916334 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:33.916307 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kmm7p_5e473fb6-5d6c-47e5-9f17-d87b134e316e/node-ca/0.log"
Apr 20 13:31:34.796032 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:34.795997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:34.796496 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:34.796194 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 13:31:34.796496 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:34.796276 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls podName:824d27e0-a488-4bf8-badb-5a72756a911c nodeName:}" failed. No retries permitted until 2026-04-20 13:31:36.79625462 +0000 UTC m=+53.391945676 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h754j" (UID: "824d27e0-a488-4bf8-badb-5a72756a911c") : secret "insights-runtime-extractor-tls" not found
Apr 20 13:31:34.914437 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:34.914409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kl8nw_c643cd33-a7a0-4649-8ea1-1c6cc7ad1130/migrator/0.log"
Apr 20 13:31:35.114739 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:35.114672 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kl8nw_c643cd33-a7a0-4649-8ea1-1c6cc7ad1130/graceful-termination/0.log"
Apr 20 13:31:35.315549 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:35.315510 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-j5mhj_689d9ed8-d3dd-4b84-a93f-cc84672538b6/kube-storage-version-migrator-operator/0.log"
Apr 20 13:31:36.811814 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:36.811780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:36.812216 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:36.811916 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 13:31:36.812216 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:36.811977 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls podName:824d27e0-a488-4bf8-badb-5a72756a911c nodeName:}" failed. No retries permitted until 2026-04-20 13:31:40.811961815 +0000 UTC m=+57.407652873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-h754j" (UID: "824d27e0-a488-4bf8-badb-5a72756a911c") : secret "insights-runtime-extractor-tls" not found
Apr 20 13:31:38.423854 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:38.423814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:38.424328 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:38.423949 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 13:31:38.424328 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:38.423966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n"
Apr 20 13:31:38.424328 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:38.424005 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:54.423992058 +0000 UTC m=+71.019683125 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : secret "router-metrics-certs-default" not found
Apr 20 13:31:38.424328 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:38.424102 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle podName:a70ca6b1-f55d-4081-b09f-dd5454b489d3 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:54.424086037 +0000 UTC m=+71.019777105 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle") pod "router-default-67666b9c78-x7c4n" (UID: "a70ca6b1-f55d-4081-b09f-dd5454b489d3") : configmap references non-existent config key: service-ca.crt
Apr 20 13:31:38.525056 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:38.525014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"
Apr 20 13:31:38.525244 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:38.525135 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 13:31:38.525244 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:38.525211 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls podName:27fd6d03-d487-4763-a29e-c24f39dbeb32 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:54.525194489 +0000 UTC m=+71.120885544 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vtms5" (UID: "27fd6d03-d487-4763-a29e-c24f39dbeb32") : secret "cluster-monitoring-operator-tls" not found
Apr 20 13:31:40.844346 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:40.844313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:40.846975 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:40.846946 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/824d27e0-a488-4bf8-badb-5a72756a911c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h754j\" (UID: \"824d27e0-a488-4bf8-badb-5a72756a911c\") " pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:40.889452 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:40.889419 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h754j"
Apr 20 13:31:41.006882 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:41.006844 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h754j"]
Apr 20 13:31:41.010475 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:41.010445 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824d27e0_a488_4bf8_badb_5a72756a911c.slice/crio-33dd91c2b37410d3868f08ff41d9f4f791da246076f90b602c9a259ea2042697 WatchSource:0}: Error finding container 33dd91c2b37410d3868f08ff41d9f4f791da246076f90b602c9a259ea2042697: Status 404 returned error can't find the container with id 33dd91c2b37410d3868f08ff41d9f4f791da246076f90b602c9a259ea2042697
Apr 20 13:31:41.219265 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:41.219180 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h754j" event={"ID":"824d27e0-a488-4bf8-badb-5a72756a911c","Type":"ContainerStarted","Data":"0cd9994a47b3a1089cd4374630f41de49dd0e935ee7a3c11b797cee6a8d8c699"}
Apr 20 13:31:41.219265 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:41.219218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h754j" event={"ID":"824d27e0-a488-4bf8-badb-5a72756a911c","Type":"ContainerStarted","Data":"33dd91c2b37410d3868f08ff41d9f4f791da246076f90b602c9a259ea2042697"}
Apr 20 13:31:42.223866 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:42.223824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h754j" event={"ID":"824d27e0-a488-4bf8-badb-5a72756a911c","Type":"ContainerStarted","Data":"7af9993d7ced9c3a13a13ac7c939466816514cb7a65af069f0d1f67ff8c84812"}
Apr 20 13:31:43.134476 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:43.134294 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-drksq"
Apr 20 13:31:44.229784 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:44.229747 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h754j" event={"ID":"824d27e0-a488-4bf8-badb-5a72756a911c","Type":"ContainerStarted","Data":"a5c460de4103a97448ef9a08cc0f2bce0c826e6160dfceb1e02eaf91805e6a25"}
Apr 20 13:31:44.251677 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:44.251632 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-h754j" podStartSLOduration=9.086287533 podStartE2EDuration="11.25161819s" podCreationTimestamp="2026-04-20 13:31:33 +0000 UTC" firstStartedPulling="2026-04-20 13:31:41.067537769 +0000 UTC m=+57.663228823" lastFinishedPulling="2026-04-20 13:31:43.232868426 +0000 UTC m=+59.828559480" observedRunningTime="2026-04-20 13:31:44.250859062 +0000 UTC m=+60.846550174" watchObservedRunningTime="2026-04-20 13:31:44.25161819 +0000 UTC m=+60.847309265"
Apr 20 13:31:44.976552 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:44.976517 2573 scope.go:117] "RemoveContainer" containerID="66c9c5e31ceaa01741886b7054e6feb95a316f84f11c8f810e63a48ca8866244"
Apr 20 13:31:45.234350 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:45.234280 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log"
Apr 20 13:31:45.234787 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:45.234670 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/1.log"
Apr 20 13:31:45.234787 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:45.234713 2573 generic.go:358] "Generic (PLEG): container finished" podID="409c02a3-0a51-4fe6-813b-cc03f7497104" containerID="59cca0501ca7bd97052289b06681a289fedf2533dea710fa46ff3d17a6555d29" exitCode=255
Apr 20 13:31:45.234900 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:45.234782 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" event={"ID":"409c02a3-0a51-4fe6-813b-cc03f7497104","Type":"ContainerDied","Data":"59cca0501ca7bd97052289b06681a289fedf2533dea710fa46ff3d17a6555d29"}
Apr 20 13:31:45.234900 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:45.234826 2573 scope.go:117] "RemoveContainer" containerID="66c9c5e31ceaa01741886b7054e6feb95a316f84f11c8f810e63a48ca8866244"
Apr 20 13:31:45.235173 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:45.235157 2573 scope.go:117] "RemoveContainer" containerID="59cca0501ca7bd97052289b06681a289fedf2533dea710fa46ff3d17a6555d29"
Apr 20 13:31:45.235395 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:45.235370 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rtbvg_openshift-console-operator(409c02a3-0a51-4fe6-813b-cc03f7497104)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" podUID="409c02a3-0a51-4fe6-813b-cc03f7497104"
Apr 20 13:31:46.241384 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:46.241354 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log"
Apr 20 13:31:48.711326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.711284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:48.711808 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.711390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8"
Apr 20 13:31:48.711808 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.711426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:48.713657 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.713623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d3811b3-7e75-4345-b591-277c5aecb5fd-metrics-tls\") pod \"dns-default-p7fbq\" (UID: \"8d3811b3-7e75-4345-b591-277c5aecb5fd\") " pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:48.713792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.713776 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"image-registry-7889fdc99c-s6dfk\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:48.713860 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.713776 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e1662ff-63f6-4f08-9e96-75f038878584-cert\") pod \"ingress-canary-hdgh8\" (UID: \"8e1662ff-63f6-4f08-9e96-75f038878584\") " pod="openshift-ingress-canary/ingress-canary-hdgh8"
Apr 20 13:31:48.913925 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.913898 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xw967\""
Apr 20 13:31:48.921934 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.921913 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:48.926061 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.926039 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c8wr5\""
Apr 20 13:31:48.932088 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.932067 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mqqqc\""
Apr 20 13:31:48.934725 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.934707 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p7fbq"
Apr 20 13:31:48.940409 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:48.940388 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hdgh8"
Apr 20 13:31:49.077593 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.077562 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7889fdc99c-s6dfk"]
Apr 20 13:31:49.080603 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.080580 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p7fbq"]
Apr 20 13:31:49.081689 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:49.081661 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1314261e_ac1b_4ca4_8c20_11cf3ad0f281.slice/crio-c6795ff156e96bd0a5a4bf014dd525145acd4b07264c1fce864aef824104d860 WatchSource:0}: Error finding container c6795ff156e96bd0a5a4bf014dd525145acd4b07264c1fce864aef824104d860: Status 404 returned error can't find the container with id c6795ff156e96bd0a5a4bf014dd525145acd4b07264c1fce864aef824104d860
Apr 20 13:31:49.083344 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:49.083321 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d3811b3_7e75_4345_b591_277c5aecb5fd.slice/crio-9121e43d033a8d2cbf40c970283133d5fa23c4ee5594947479a3c6ac6e714ccc WatchSource:0}: Error finding container 9121e43d033a8d2cbf40c970283133d5fa23c4ee5594947479a3c6ac6e714ccc: Status 404 returned error can't find the container with id 9121e43d033a8d2cbf40c970283133d5fa23c4ee5594947479a3c6ac6e714ccc
Apr 20 13:31:49.094654 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.094632 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hdgh8"]
Apr 20 13:31:49.097652 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:49.097628 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e1662ff_63f6_4f08_9e96_75f038878584.slice/crio-fc84527e94f0d862fb228c183dc5eeafd9d8c7218dd427219312a3951a40cabd WatchSource:0}: Error finding container fc84527e94f0d862fb228c183dc5eeafd9d8c7218dd427219312a3951a40cabd: Status 404 returned error can't find the container with id fc84527e94f0d862fb228c183dc5eeafd9d8c7218dd427219312a3951a40cabd
Apr 20 13:31:49.249602 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.249509 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p7fbq" event={"ID":"8d3811b3-7e75-4345-b591-277c5aecb5fd","Type":"ContainerStarted","Data":"9121e43d033a8d2cbf40c970283133d5fa23c4ee5594947479a3c6ac6e714ccc"}
Apr 20 13:31:49.250951 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.250923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" event={"ID":"1314261e-ac1b-4ca4-8c20-11cf3ad0f281","Type":"ContainerStarted","Data":"956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330"}
Apr 20 13:31:49.251096 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.250954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" event={"ID":"1314261e-ac1b-4ca4-8c20-11cf3ad0f281","Type":"ContainerStarted","Data":"c6795ff156e96bd0a5a4bf014dd525145acd4b07264c1fce864aef824104d860"}
Apr 20 13:31:49.251174 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.251104 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:31:49.251967 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.251947 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hdgh8" event={"ID":"8e1662ff-63f6-4f08-9e96-75f038878584","Type":"ContainerStarted","Data":"fc84527e94f0d862fb228c183dc5eeafd9d8c7218dd427219312a3951a40cabd"}
Apr 20 13:31:49.273035 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.272981 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" podStartSLOduration=65.272968355 podStartE2EDuration="1m5.272968355s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:49.272271628 +0000 UTC m=+65.867962704" watchObservedRunningTime="2026-04-20 13:31:49.272968355 +0000 UTC m=+65.868659431"
Apr 20 13:31:49.721064 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.721030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:31:49.723846 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:49.723703 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0de99a89-e8e5-491a-90c3-5c371ed6705f-metrics-certs\") pod \"network-metrics-daemon-5w9cl\" (UID: \"0de99a89-e8e5-491a-90c3-5c371ed6705f\") " pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:31:50.003775 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:50.003703 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gqjcl\""
Apr 20 13:31:50.011780 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:50.011753 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5w9cl"
Apr 20 13:31:50.174828 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:50.174794 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5w9cl"]
Apr 20 13:31:50.178627 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:50.178596 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de99a89_e8e5_491a_90c3_5c371ed6705f.slice/crio-95ced9d02ce95d0242ef0f037e4dc18a8fe3206a5f542199395d1fcf76468af5 WatchSource:0}: Error finding container 95ced9d02ce95d0242ef0f037e4dc18a8fe3206a5f542199395d1fcf76468af5: Status 404 returned error can't find the container with id 95ced9d02ce95d0242ef0f037e4dc18a8fe3206a5f542199395d1fcf76468af5
Apr 20 13:31:50.256806 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:50.256725 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5w9cl" event={"ID":"0de99a89-e8e5-491a-90c3-5c371ed6705f","Type":"ContainerStarted","Data":"95ced9d02ce95d0242ef0f037e4dc18a8fe3206a5f542199395d1fcf76468af5"}
Apr 20 13:31:52.264103 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.264067 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hdgh8" event={"ID":"8e1662ff-63f6-4f08-9e96-75f038878584","Type":"ContainerStarted","Data":"919d0188bd9d019630ef25ff0370fa2ec4bd4542f756d3ed38aaf96a8b20e440"}
Apr 20 13:31:52.280649 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.280604 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hdgh8" podStartSLOduration=34.177406915 podStartE2EDuration="36.280592036s" podCreationTimestamp="2026-04-20 13:31:16 +0000 UTC" firstStartedPulling="2026-04-20 13:31:49.099593025 +0000 UTC m=+65.695284093" lastFinishedPulling="2026-04-20 13:31:51.202778147 +0000 UTC m=+67.798469214" observedRunningTime="2026-04-20 13:31:52.279081027 +0000 UTC m=+68.874772103" watchObservedRunningTime="2026-04-20 13:31:52.280592036 +0000 UTC m=+68.876283090"
Apr 20 13:31:52.556955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.556916 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz"]
Apr 20 13:31:52.576659 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.576626 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz"]
Apr 20 13:31:52.576786 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.576736 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz"
Apr 20 13:31:52.583068 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.583045 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 20 13:31:52.583068 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.583059 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 13:31:52.583620 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.583603 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-tch4b\""
Apr 20 13:31:52.583620 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.583614 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 13:31:52.583751 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.583690 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 13:31:52.643743 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.643717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bwl\" (UniqueName: \"kubernetes.io/projected/c516b2cf-a666-4344-972a-180f9a0dea94-kube-api-access-65bwl\") pod \"managed-serviceaccount-addon-agent-dbd866957-8zfnz\" (UID: \"c516b2cf-a666-4344-972a-180f9a0dea94\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" Apr 20 13:31:52.643882 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.643770 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c516b2cf-a666-4344-972a-180f9a0dea94-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-dbd866957-8zfnz\" (UID: \"c516b2cf-a666-4344-972a-180f9a0dea94\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" Apr 20 13:31:52.664804 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.664778 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm"] Apr 20 13:31:52.683341 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.683318 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" Apr 20 13:31:52.684129 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.684086 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm"] Apr 20 13:31:52.685627 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.685608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zm7f5\"" Apr 20 13:31:52.685713 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.685631 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 13:31:52.686003 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.685987 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 13:31:52.744683 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.744661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65ad83cd-9d60-4721-b86e-87436c3f0696-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-b5jqm\" (UID: \"65ad83cd-9d60-4721-b86e-87436c3f0696\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" Apr 20 13:31:52.744766 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.744695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65bwl\" (UniqueName: \"kubernetes.io/projected/c516b2cf-a666-4344-972a-180f9a0dea94-kube-api-access-65bwl\") pod \"managed-serviceaccount-addon-agent-dbd866957-8zfnz\" (UID: \"c516b2cf-a666-4344-972a-180f9a0dea94\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" Apr 20 13:31:52.744815 ip-10-0-142-144 kubenswrapper[2573]: I0420 
13:31:52.744780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c516b2cf-a666-4344-972a-180f9a0dea94-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-dbd866957-8zfnz\" (UID: \"c516b2cf-a666-4344-972a-180f9a0dea94\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" Apr 20 13:31:52.744898 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.744880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65ad83cd-9d60-4721-b86e-87436c3f0696-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-b5jqm\" (UID: \"65ad83cd-9d60-4721-b86e-87436c3f0696\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" Apr 20 13:31:52.747314 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.747292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c516b2cf-a666-4344-972a-180f9a0dea94-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-dbd866957-8zfnz\" (UID: \"c516b2cf-a666-4344-972a-180f9a0dea94\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" Apr 20 13:31:52.761215 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.761192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bwl\" (UniqueName: \"kubernetes.io/projected/c516b2cf-a666-4344-972a-180f9a0dea94-kube-api-access-65bwl\") pod \"managed-serviceaccount-addon-agent-dbd866957-8zfnz\" (UID: \"c516b2cf-a666-4344-972a-180f9a0dea94\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" Apr 20 13:31:52.845654 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.845630 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65ad83cd-9d60-4721-b86e-87436c3f0696-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-b5jqm\" (UID: \"65ad83cd-9d60-4721-b86e-87436c3f0696\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" Apr 20 13:31:52.845768 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.845745 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65ad83cd-9d60-4721-b86e-87436c3f0696-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-b5jqm\" (UID: \"65ad83cd-9d60-4721-b86e-87436c3f0696\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" Apr 20 13:31:52.846664 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.846639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/65ad83cd-9d60-4721-b86e-87436c3f0696-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-b5jqm\" (UID: \"65ad83cd-9d60-4721-b86e-87436c3f0696\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" Apr 20 13:31:52.848313 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.848294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/65ad83cd-9d60-4721-b86e-87436c3f0696-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-b5jqm\" (UID: \"65ad83cd-9d60-4721-b86e-87436c3f0696\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" Apr 20 13:31:52.898968 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.898947 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" Apr 20 13:31:52.962158 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.962094 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:52.962158 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.962151 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" Apr 20 13:31:52.962634 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.962567 2573 scope.go:117] "RemoveContainer" containerID="59cca0501ca7bd97052289b06681a289fedf2533dea710fa46ff3d17a6555d29" Apr 20 13:31:52.962824 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:31:52.962801 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rtbvg_openshift-console-operator(409c02a3-0a51-4fe6-813b-cc03f7497104)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" podUID="409c02a3-0a51-4fe6-813b-cc03f7497104" Apr 20 13:31:52.993672 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:52.993378 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" Apr 20 13:31:53.047300 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.047205 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz"] Apr 20 13:31:53.051414 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:53.051381 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc516b2cf_a666_4344_972a_180f9a0dea94.slice/crio-12a75f69379ff8ba430d4374407cbc3575dab4f3a5effba4799f9a7b4ce1932c WatchSource:0}: Error finding container 12a75f69379ff8ba430d4374407cbc3575dab4f3a5effba4799f9a7b4ce1932c: Status 404 returned error can't find the container with id 12a75f69379ff8ba430d4374407cbc3575dab4f3a5effba4799f9a7b4ce1932c Apr 20 13:31:53.157373 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.157352 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm"] Apr 20 13:31:53.159374 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:53.159350 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65ad83cd_9d60_4721_b86e_87436c3f0696.slice/crio-96f58f62c4240a0a154a2cfca5986fba31f6ec628b61a2dd5dffa856c28d997e WatchSource:0}: Error finding container 96f58f62c4240a0a154a2cfca5986fba31f6ec628b61a2dd5dffa856c28d997e: Status 404 returned error can't find the container with id 96f58f62c4240a0a154a2cfca5986fba31f6ec628b61a2dd5dffa856c28d997e Apr 20 13:31:53.268306 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.268269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" 
event={"ID":"65ad83cd-9d60-4721-b86e-87436c3f0696","Type":"ContainerStarted","Data":"96f58f62c4240a0a154a2cfca5986fba31f6ec628b61a2dd5dffa856c28d997e"} Apr 20 13:31:53.269817 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.269789 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p7fbq" event={"ID":"8d3811b3-7e75-4345-b591-277c5aecb5fd","Type":"ContainerStarted","Data":"5ba056c1ab8a5b120bc3df6b2da0e78e030f6a566a062b466247d3ce8691124b"} Apr 20 13:31:53.269817 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.269815 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p7fbq" event={"ID":"8d3811b3-7e75-4345-b591-277c5aecb5fd","Type":"ContainerStarted","Data":"b5d976cbc101fff27e82f21ba0466cb828615daa26a1cda992fbcfd3065e4d55"} Apr 20 13:31:53.269985 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.269892 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p7fbq" Apr 20 13:31:53.271340 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.271311 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5w9cl" event={"ID":"0de99a89-e8e5-491a-90c3-5c371ed6705f","Type":"ContainerStarted","Data":"58002063377e0bfa9cf310bc85efce48ab642dbdace715ea652d8b6171eb41d7"} Apr 20 13:31:53.271340 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.271337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5w9cl" event={"ID":"0de99a89-e8e5-491a-90c3-5c371ed6705f","Type":"ContainerStarted","Data":"be556836027b31e4fe7b30f1af6b54114b2c6ef0d212d6346ce57bf0197cb78e"} Apr 20 13:31:53.272281 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.272262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" 
event={"ID":"c516b2cf-a666-4344-972a-180f9a0dea94","Type":"ContainerStarted","Data":"12a75f69379ff8ba430d4374407cbc3575dab4f3a5effba4799f9a7b4ce1932c"} Apr 20 13:31:53.288942 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.288894 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p7fbq" podStartSLOduration=33.655111768 podStartE2EDuration="37.288883345s" podCreationTimestamp="2026-04-20 13:31:16 +0000 UTC" firstStartedPulling="2026-04-20 13:31:49.084903807 +0000 UTC m=+65.680594865" lastFinishedPulling="2026-04-20 13:31:52.718675375 +0000 UTC m=+69.314366442" observedRunningTime="2026-04-20 13:31:53.287669478 +0000 UTC m=+69.883360554" watchObservedRunningTime="2026-04-20 13:31:53.288883345 +0000 UTC m=+69.884574420" Apr 20 13:31:53.309159 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:53.309093 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5w9cl" podStartSLOduration=66.774701949 podStartE2EDuration="1m9.309076054s" podCreationTimestamp="2026-04-20 13:30:44 +0000 UTC" firstStartedPulling="2026-04-20 13:31:50.18083064 +0000 UTC m=+66.776521695" lastFinishedPulling="2026-04-20 13:31:52.715204731 +0000 UTC m=+69.310895800" observedRunningTime="2026-04-20 13:31:53.3076005 +0000 UTC m=+69.903291576" watchObservedRunningTime="2026-04-20 13:31:53.309076054 +0000 UTC m=+69.904767131" Apr 20 13:31:54.461084 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.461050 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:54.461457 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.461103 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:54.461810 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.461783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a70ca6b1-f55d-4081-b09f-dd5454b489d3-service-ca-bundle\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:54.463388 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.463362 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a70ca6b1-f55d-4081-b09f-dd5454b489d3-metrics-certs\") pod \"router-default-67666b9c78-x7c4n\" (UID: \"a70ca6b1-f55d-4081-b09f-dd5454b489d3\") " pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:54.505482 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.505459 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vwqlb\"" Apr 20 13:31:54.513293 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.513269 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:54.562362 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.562328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:54.564884 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.564862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27fd6d03-d487-4763-a29e-c24f39dbeb32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vtms5\" (UID: \"27fd6d03-d487-4763-a29e-c24f39dbeb32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:54.848755 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.848723 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-f64q9\"" Apr 20 13:31:54.857462 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:54.857432 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" Apr 20 13:31:55.171563 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:55.171485 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sb687" Apr 20 13:31:55.981118 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:55.981093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-67666b9c78-x7c4n"] Apr 20 13:31:55.983331 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:55.983290 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda70ca6b1_f55d_4081_b09f_dd5454b489d3.slice/crio-b46179efd36a57b58409a036919b2a4f82ad473e5604cdff7a37531a5b0daae1 WatchSource:0}: Error finding container b46179efd36a57b58409a036919b2a4f82ad473e5604cdff7a37531a5b0daae1: Status 404 returned error can't find the container with id b46179efd36a57b58409a036919b2a4f82ad473e5604cdff7a37531a5b0daae1 Apr 20 13:31:55.995828 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:55.995806 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5"] Apr 20 13:31:55.998645 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:31:55.998622 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27fd6d03_d487_4763_a29e_c24f39dbeb32.slice/crio-9e4839575f903fcfc7655014a42ee0ff3a396d4cd2255a208e740e4fff1c3814 WatchSource:0}: Error finding container 9e4839575f903fcfc7655014a42ee0ff3a396d4cd2255a208e740e4fff1c3814: Status 404 returned error can't find the container with id 9e4839575f903fcfc7655014a42ee0ff3a396d4cd2255a208e740e4fff1c3814 Apr 20 13:31:56.280833 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:56.280759 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" event={"ID":"27fd6d03-d487-4763-a29e-c24f39dbeb32","Type":"ContainerStarted","Data":"9e4839575f903fcfc7655014a42ee0ff3a396d4cd2255a208e740e4fff1c3814"} Apr 20 13:31:56.282025 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:56.282000 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" event={"ID":"65ad83cd-9d60-4721-b86e-87436c3f0696","Type":"ContainerStarted","Data":"111d5a456abb35e8139435d6c9a3411aedaa651e399f99c558241fd2e76965bf"} Apr 20 13:31:56.283370 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:56.283345 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-67666b9c78-x7c4n" event={"ID":"a70ca6b1-f55d-4081-b09f-dd5454b489d3","Type":"ContainerStarted","Data":"8410445deb1a05b58164e958246827f6ac34af0c340acc052baa262ee6c8a898"} Apr 20 13:31:56.283527 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:56.283376 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-67666b9c78-x7c4n" event={"ID":"a70ca6b1-f55d-4081-b09f-dd5454b489d3","Type":"ContainerStarted","Data":"b46179efd36a57b58409a036919b2a4f82ad473e5604cdff7a37531a5b0daae1"} Apr 20 13:31:56.302756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:56.302703 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-b5jqm" podStartSLOduration=1.60753435 podStartE2EDuration="4.302686348s" podCreationTimestamp="2026-04-20 13:31:52 +0000 UTC" firstStartedPulling="2026-04-20 13:31:53.161115946 +0000 UTC m=+69.756807004" lastFinishedPulling="2026-04-20 13:31:55.856267929 +0000 UTC m=+72.451959002" observedRunningTime="2026-04-20 13:31:56.301485405 +0000 UTC m=+72.897176482" watchObservedRunningTime="2026-04-20 13:31:56.302686348 +0000 UTC m=+72.898377463" Apr 20 13:31:56.322235 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:31:56.322156 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-67666b9c78-x7c4n" podStartSLOduration=34.322124238 podStartE2EDuration="34.322124238s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:56.32180643 +0000 UTC m=+72.917497517" watchObservedRunningTime="2026-04-20 13:31:56.322124238 +0000 UTC m=+72.917815313" Apr 20 13:31:56.513760 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:56.513711 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:56.516050 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:56.516031 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:57.290748 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:57.290704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" event={"ID":"c516b2cf-a666-4344-972a-180f9a0dea94","Type":"ContainerStarted","Data":"d386c49f130acfad56e892a65d0c7c5ebb19e66627ad7b1c64e925039ce1d299"} Apr 20 13:31:57.291242 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:57.291073 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:57.292425 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:57.292405 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-67666b9c78-x7c4n" Apr 20 13:31:57.308127 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:57.308089 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-dbd866957-8zfnz" podStartSLOduration=2.071839796 podStartE2EDuration="5.308078139s" podCreationTimestamp="2026-04-20 13:31:52 +0000 UTC" firstStartedPulling="2026-04-20 13:31:53.053930713 +0000 UTC m=+69.649621773" lastFinishedPulling="2026-04-20 13:31:56.290169062 +0000 UTC m=+72.885860116" observedRunningTime="2026-04-20 13:31:57.307321887 +0000 UTC m=+73.903012959" watchObservedRunningTime="2026-04-20 13:31:57.308078139 +0000 UTC m=+73.903769228" Apr 20 13:31:59.298186 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:31:59.298132 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" event={"ID":"27fd6d03-d487-4763-a29e-c24f39dbeb32","Type":"ContainerStarted","Data":"e7a916f03a8053dfd14331ecbab40fa813c4a1ad53954f63010dae7e41c137b0"} Apr 20 13:32:03.277840 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:03.277736 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p7fbq" Apr 20 13:32:03.297676 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:03.297608 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vtms5" podStartSLOduration=38.451084731 podStartE2EDuration="41.297594454s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="2026-04-20 13:31:56.000718263 +0000 UTC m=+72.596409329" lastFinishedPulling="2026-04-20 13:31:58.847227983 +0000 UTC m=+75.442919052" observedRunningTime="2026-04-20 13:31:59.323670445 +0000 UTC m=+75.919361523" watchObservedRunningTime="2026-04-20 13:32:03.297594454 +0000 UTC m=+79.893285525" Apr 20 13:32:04.975743 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:04.975713 2573 scope.go:117] "RemoveContainer" containerID="59cca0501ca7bd97052289b06681a289fedf2533dea710fa46ff3d17a6555d29" Apr 20 13:32:04.976129 ip-10-0-142-144 
kubenswrapper[2573]: E0420 13:32:04.975902 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rtbvg_openshift-console-operator(409c02a3-0a51-4fe6-813b-cc03f7497104)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" podUID="409c02a3-0a51-4fe6-813b-cc03f7497104"
Apr 20 13:32:08.926335 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:08.926290 2573 patch_prober.go:28] interesting pod/image-registry-7889fdc99c-s6dfk container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 13:32:08.926759 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:08.926353 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" podUID="1314261e-ac1b-4ca4-8c20-11cf3ad0f281" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 13:32:09.905336 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:09.905305 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gx9q4"]
Apr 20 13:32:09.909756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:09.909735 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:09.911960 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:09.911940 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 13:32:09.911960 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:09.911957 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 13:32:09.912113 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:09.911977 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dzwpq\""
Apr 20 13:32:09.912321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:09.912303 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 13:32:09.912844 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:09.912832 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 13:32:10.072734 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072703 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-metrics-client-ca\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.073126 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.073126 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-sys\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.073126 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.073126 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-root\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.073126 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wv5j\" (UniqueName: \"kubernetes.io/projected/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-kube-api-access-5wv5j\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.073126 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072870 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-textfile\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.073126 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-wtmp\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.073126 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.072943 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-tls\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174128 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-metrics-client-ca\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174128 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174128 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-sys\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174128 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174404 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174163 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-root\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174404 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wv5j\" (UniqueName: \"kubernetes.io/projected/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-kube-api-access-5wv5j\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174404 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174203 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-textfile\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174404 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-wtmp\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174404 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-tls\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.174404 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:32:10.174355 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 13:32:10.174578 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:32:10.174408 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-tls podName:0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5 nodeName:}" failed. No retries permitted until 2026-04-20 13:32:10.674390887 +0000 UTC m=+87.270081942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-tls") pod "node-exporter-gx9q4" (UID: "0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5") : secret "node-exporter-tls" not found
Apr 20 13:32:10.174683 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-metrics-client-ca\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.175031 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.174999 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-root\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.175179 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.175068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-sys\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.175262 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.175210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-wtmp\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.175321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.175289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.175523 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.175500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-textfile\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.176795 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.176774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.184365 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.184342 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wv5j\" (UniqueName: \"kubernetes.io/projected/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-kube-api-access-5wv5j\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.260935 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.260913 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk"
Apr 20 13:32:10.679231 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.679198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-tls\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.681561 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.681534 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5-node-exporter-tls\") pod \"node-exporter-gx9q4\" (UID: \"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5\") " pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.818472 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:10.818441 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gx9q4"
Apr 20 13:32:10.826695 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:32:10.826668 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc9fc48_225a_4d0f_80cf_e171c1dfe6d5.slice/crio-def549aef1c1e91d45d982149aba9ce31a3bf2c1c7d5c6b23b02c27851415ddf WatchSource:0}: Error finding container def549aef1c1e91d45d982149aba9ce31a3bf2c1c7d5c6b23b02c27851415ddf: Status 404 returned error can't find the container with id def549aef1c1e91d45d982149aba9ce31a3bf2c1c7d5c6b23b02c27851415ddf
Apr 20 13:32:11.330729 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:11.330691 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gx9q4" event={"ID":"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5","Type":"ContainerStarted","Data":"def549aef1c1e91d45d982149aba9ce31a3bf2c1c7d5c6b23b02c27851415ddf"}
Apr 20 13:32:12.334788 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:12.334748 2573 generic.go:358] "Generic (PLEG): container finished" podID="0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5" containerID="820ae5472d1f2ec2f0bf27f7a49658f95b4c47bc11dcc967d88c50c38db8e07d" exitCode=0
Apr 20 13:32:12.335184 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:12.334811 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gx9q4" event={"ID":"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5","Type":"ContainerDied","Data":"820ae5472d1f2ec2f0bf27f7a49658f95b4c47bc11dcc967d88c50c38db8e07d"}
Apr 20 13:32:13.339120 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:13.339087 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gx9q4" event={"ID":"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5","Type":"ContainerStarted","Data":"c4c97bae3c65a72058626003755bc1de91452738d98c88bad38e5386fb8545a2"}
Apr 20 13:32:13.339120 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:13.339123 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gx9q4" event={"ID":"0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5","Type":"ContainerStarted","Data":"885b196def7ffdf038e28409423fd7737d7f9338c78f6edc87df15487658440e"}
Apr 20 13:32:13.368020 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:13.367979 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gx9q4" podStartSLOduration=3.722417531 podStartE2EDuration="4.367966778s" podCreationTimestamp="2026-04-20 13:32:09 +0000 UTC" firstStartedPulling="2026-04-20 13:32:10.8282171 +0000 UTC m=+87.423908153" lastFinishedPulling="2026-04-20 13:32:11.473766345 +0000 UTC m=+88.069457400" observedRunningTime="2026-04-20 13:32:13.367412677 +0000 UTC m=+89.963103753" watchObservedRunningTime="2026-04-20 13:32:13.367966778 +0000 UTC m=+89.963657853"
Apr 20 13:32:14.493721 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:14.493678 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7889fdc99c-s6dfk"]
Apr 20 13:32:19.976037 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:19.976006 2573 scope.go:117] "RemoveContainer" containerID="59cca0501ca7bd97052289b06681a289fedf2533dea710fa46ff3d17a6555d29"
Apr 20 13:32:20.360369 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.360335 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log"
Apr 20 13:32:20.360563 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.360395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" event={"ID":"409c02a3-0a51-4fe6-813b-cc03f7497104","Type":"ContainerStarted","Data":"7c84a1f20f5e3a2ad5d7593baea3321d4782373f40a74d3e85fb9ccec7f2fb97"}
Apr 20 13:32:20.360706 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.360685 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg"
Apr 20 13:32:20.367160 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.367115 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg"
Apr 20 13:32:20.378459 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.378419 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-rtbvg" podStartSLOduration=55.218968526 podStartE2EDuration="58.378354859s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="2026-04-20 13:31:23.178420246 +0000 UTC m=+39.774111303" lastFinishedPulling="2026-04-20 13:31:26.33780658 +0000 UTC m=+42.933497636" observedRunningTime="2026-04-20 13:32:20.378310716 +0000 UTC m=+96.974001809" watchObservedRunningTime="2026-04-20 13:32:20.378354859 +0000 UTC m=+96.974045934"
Apr 20 13:32:20.513159 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.513110 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-b7824"]
Apr 20 13:32:20.516282 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.516262 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-b7824"
Apr 20 13:32:20.518969 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.518950 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4wl7z\""
Apr 20 13:32:20.519265 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.519234 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 13:32:20.520241 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.520224 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 13:32:20.539368 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.539343 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-b7824"]
Apr 20 13:32:20.658379 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.658288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w79p\" (UniqueName: \"kubernetes.io/projected/248d400c-8b1e-42d8-a2ec-ba381005c2c7-kube-api-access-5w79p\") pod \"downloads-6bcc868b7-b7824\" (UID: \"248d400c-8b1e-42d8-a2ec-ba381005c2c7\") " pod="openshift-console/downloads-6bcc868b7-b7824"
Apr 20 13:32:20.759508 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.759477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w79p\" (UniqueName: \"kubernetes.io/projected/248d400c-8b1e-42d8-a2ec-ba381005c2c7-kube-api-access-5w79p\") pod \"downloads-6bcc868b7-b7824\" (UID: \"248d400c-8b1e-42d8-a2ec-ba381005c2c7\") " pod="openshift-console/downloads-6bcc868b7-b7824"
Apr 20 13:32:20.766943 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.766914 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w79p\" (UniqueName: \"kubernetes.io/projected/248d400c-8b1e-42d8-a2ec-ba381005c2c7-kube-api-access-5w79p\") pod \"downloads-6bcc868b7-b7824\" (UID: \"248d400c-8b1e-42d8-a2ec-ba381005c2c7\") " pod="openshift-console/downloads-6bcc868b7-b7824"
Apr 20 13:32:20.825324 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.825287 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-b7824"
Apr 20 13:32:20.942745 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:20.942711 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-b7824"]
Apr 20 13:32:20.945181 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:32:20.945133 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod248d400c_8b1e_42d8_a2ec_ba381005c2c7.slice/crio-5f309ec77a7f8077b2c45b4915319df46dbb635a8929f101e1fd78028430bb90 WatchSource:0}: Error finding container 5f309ec77a7f8077b2c45b4915319df46dbb635a8929f101e1fd78028430bb90: Status 404 returned error can't find the container with id 5f309ec77a7f8077b2c45b4915319df46dbb635a8929f101e1fd78028430bb90
Apr 20 13:32:21.363675 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:21.363638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-b7824" event={"ID":"248d400c-8b1e-42d8-a2ec-ba381005c2c7","Type":"ContainerStarted","Data":"5f309ec77a7f8077b2c45b4915319df46dbb635a8929f101e1fd78028430bb90"}
Apr 20 13:32:29.269563 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.269528 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55bb49c67f-sj9bz"]
Apr 20 13:32:29.334071 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.334037 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55bb49c67f-sj9bz"]
Apr 20 13:32:29.334259 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.334182 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.339743 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.339666 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 13:32:29.340637 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.340610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 13:32:29.340788 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.340650 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 13:32:29.340788 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.340710 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4vhlx\""
Apr 20 13:32:29.340788 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.340652 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 13:32:29.341314 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.341280 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 13:32:29.427685 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.427649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-console-config\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.427861 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.427726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-service-ca\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.427861 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.427748 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmck\" (UniqueName: \"kubernetes.io/projected/41f3d115-5280-417c-90a7-e966833bc357-kube-api-access-smmck\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.427861 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.427780 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-oauth-serving-cert\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.428009 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.427860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-oauth-config\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.428009 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.427920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-serving-cert\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.529437 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.529309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-service-ca\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.529437 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.529358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smmck\" (UniqueName: \"kubernetes.io/projected/41f3d115-5280-417c-90a7-e966833bc357-kube-api-access-smmck\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.529437 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.529398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-oauth-serving-cert\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.529721 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.529442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-oauth-config\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.529721 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.529487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-serving-cert\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.529721 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.529522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-console-config\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.530200 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.530168 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-oauth-serving-cert\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.530200 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.530185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-console-config\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.530200 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.530175 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-service-ca\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.532364 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.532342 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-oauth-config\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.542573 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.542527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-serving-cert\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.542834 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.542808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmck\" (UniqueName: \"kubernetes.io/projected/41f3d115-5280-417c-90a7-e966833bc357-kube-api-access-smmck\") pod \"console-55bb49c67f-sj9bz\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") " pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.645922 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.645881 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:32:29.813137 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:29.813085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55bb49c67f-sj9bz"]
Apr 20 13:32:29.816705 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:32:29.816675 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41f3d115_5280_417c_90a7_e966833bc357.slice/crio-03eb474a28d2f0f9c2c9d965ab48dcf2e744fc9d070c9b6bbf5959f633435b0a WatchSource:0}: Error finding container 03eb474a28d2f0f9c2c9d965ab48dcf2e744fc9d070c9b6bbf5959f633435b0a: Status 404 returned error can't find the container with id 03eb474a28d2f0f9c2c9d965ab48dcf2e744fc9d070c9b6bbf5959f633435b0a
Apr 20 13:32:30.388329 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:30.388291 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bb49c67f-sj9bz" event={"ID":"41f3d115-5280-417c-90a7-e966833bc357","Type":"ContainerStarted","Data":"03eb474a28d2f0f9c2c9d965ab48dcf2e744fc9d070c9b6bbf5959f633435b0a"}
Apr 20 13:32:33.400914 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:33.400870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bb49c67f-sj9bz" event={"ID":"41f3d115-5280-417c-90a7-e966833bc357","Type":"ContainerStarted","Data":"7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091"}
Apr 20 13:32:37.820329 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.820265 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55bb49c67f-sj9bz" podStartSLOduration=5.760276378 podStartE2EDuration="8.820248173s" podCreationTimestamp="2026-04-20 13:32:29 +0000 UTC" firstStartedPulling="2026-04-20 13:32:29.8186812 +0000 UTC m=+106.414372260" lastFinishedPulling="2026-04-20 13:32:32.878652998 +0000 UTC m=+109.474344055" observedRunningTime="2026-04-20 13:32:33.444013813 +0000 UTC m=+110.039704891" watchObservedRunningTime="2026-04-20 13:32:37.820248173 +0000 UTC m=+114.415939247"
Apr 20 13:32:37.820827 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.820500 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d4898ffc7-pmffs"]
Apr 20 13:32:37.823264 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.823243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:37.831899 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.831876 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 13:32:37.837427 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.837406 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d4898ffc7-pmffs"]
Apr 20 13:32:37.913764 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.913733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-serving-cert\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:37.913949 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.913781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-oauth-config\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:37.913949 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.913835 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxgl4\" (UniqueName: \"kubernetes.io/projected/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-kube-api-access-pxgl4\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:37.913949 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.913863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-config\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:37.913949 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.913926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-service-ca\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:37.914179 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.913958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-trusted-ca-bundle\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:37.914179 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:37.913992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-oauth-serving-cert\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20
13:32:38.014841 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.014799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-serving-cert\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.015034 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.014864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-oauth-config\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.015034 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.014907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxgl4\" (UniqueName: \"kubernetes.io/projected/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-kube-api-access-pxgl4\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.015034 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.014943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-config\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.015034 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.014971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-service-ca\") pod \"console-7d4898ffc7-pmffs\" (UID: 
\"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.015034 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.014995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-trusted-ca-bundle\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.015034 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.015022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-oauth-serving-cert\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.015755 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.015697 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-config\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.015887 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.015799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-oauth-serving-cert\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.016254 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.016234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-service-ca\") pod 
\"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.016579 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.016554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-trusted-ca-bundle\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.017769 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.017738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-serving-cert\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.018065 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.018033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-oauth-config\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.027384 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.027364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxgl4\" (UniqueName: \"kubernetes.io/projected/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-kube-api-access-pxgl4\") pod \"console-7d4898ffc7-pmffs\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") " pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:38.133284 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:38.133194 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d4898ffc7-pmffs" Apr 20 13:32:39.512685 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:39.512605 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" podUID="1314261e-ac1b-4ca4-8c20-11cf3ad0f281" containerName="registry" containerID="cri-o://956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330" gracePeriod=30 Apr 20 13:32:39.646791 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:39.646755 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55bb49c67f-sj9bz" Apr 20 13:32:39.646953 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:39.646809 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55bb49c67f-sj9bz" Apr 20 13:32:39.654267 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:39.653650 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55bb49c67f-sj9bz" Apr 20 13:32:40.237887 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.237869 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:32:40.336717 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.336684 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-image-registry-private-configuration\") pod \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " Apr 20 13:32:40.336886 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.336731 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsbtj\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-kube-api-access-gsbtj\") pod \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " Apr 20 13:32:40.336886 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.336764 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-bound-sa-token\") pod \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " Apr 20 13:32:40.336886 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.336806 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-trusted-ca\") pod \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " Apr 20 13:32:40.336886 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.336821 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-certificates\") pod \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\" (UID: 
\"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " Apr 20 13:32:40.336886 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.336847 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-installation-pull-secrets\") pod \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " Apr 20 13:32:40.336886 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.336871 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") pod \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " Apr 20 13:32:40.337207 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.336900 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-ca-trust-extracted\") pod \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\" (UID: \"1314261e-ac1b-4ca4-8c20-11cf3ad0f281\") " Apr 20 13:32:40.337506 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.337464 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1314261e-ac1b-4ca4-8c20-11cf3ad0f281" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:32:40.337624 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.337513 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1314261e-ac1b-4ca4-8c20-11cf3ad0f281" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:32:40.339568 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.339517 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1314261e-ac1b-4ca4-8c20-11cf3ad0f281" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:32:40.339842 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.339749 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1314261e-ac1b-4ca4-8c20-11cf3ad0f281" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:32:40.339842 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.339807 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-kube-api-access-gsbtj" (OuterVolumeSpecName: "kube-api-access-gsbtj") pod "1314261e-ac1b-4ca4-8c20-11cf3ad0f281" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281"). InnerVolumeSpecName "kube-api-access-gsbtj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:32:40.339990 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.339956 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1314261e-ac1b-4ca4-8c20-11cf3ad0f281" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:32:40.340033 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.339993 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1314261e-ac1b-4ca4-8c20-11cf3ad0f281" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:32:40.346724 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.346670 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1314261e-ac1b-4ca4-8c20-11cf3ad0f281" (UID: "1314261e-ac1b-4ca4-8c20-11cf3ad0f281"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:32:40.421499 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.421363 2573 generic.go:358] "Generic (PLEG): container finished" podID="1314261e-ac1b-4ca4-8c20-11cf3ad0f281" containerID="956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330" exitCode=0 Apr 20 13:32:40.421499 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.421439 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" Apr 20 13:32:40.421700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.421630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" event={"ID":"1314261e-ac1b-4ca4-8c20-11cf3ad0f281","Type":"ContainerDied","Data":"956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330"} Apr 20 13:32:40.421700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.421677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7889fdc99c-s6dfk" event={"ID":"1314261e-ac1b-4ca4-8c20-11cf3ad0f281","Type":"ContainerDied","Data":"c6795ff156e96bd0a5a4bf014dd525145acd4b07264c1fce864aef824104d860"} Apr 20 13:32:40.421700 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.421700 2573 scope.go:117] "RemoveContainer" containerID="956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330" Apr 20 13:32:40.423446 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.423376 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-b7824" event={"ID":"248d400c-8b1e-42d8-a2ec-ba381005c2c7","Type":"ContainerStarted","Data":"94090ea51260afbc38c040bc227039146af8ed80bb4f893499fd5f205db1e9a9"} Apr 20 13:32:40.428948 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.428910 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55bb49c67f-sj9bz" Apr 20 13:32:40.433381 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.433352 2573 scope.go:117] "RemoveContainer" containerID="956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330" Apr 20 13:32:40.433747 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:32:40.433674 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330\": container with ID starting with 956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330 not found: ID does not exist" containerID="956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330" Apr 20 13:32:40.433747 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.433712 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330"} err="failed to get container status \"956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330\": rpc error: code = NotFound desc = could not find container \"956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330\": container with ID starting with 956471d92f2654f956f36cecb45fdd7f0f1608184cf67ee52f74509ac25ea330 not found: ID does not exist" Apr 20 13:32:40.437784 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.437753 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-image-registry-private-configuration\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:32:40.437784 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.437782 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gsbtj\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-kube-api-access-gsbtj\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:32:40.437955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.437799 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-bound-sa-token\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:32:40.437955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.437813 2573 reconciler_common.go:299] "Volume 
detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-trusted-ca\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:32:40.437955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.437827 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-certificates\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:32:40.437955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.437842 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-installation-pull-secrets\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:32:40.437955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.437856 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-registry-tls\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:32:40.437955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.437869 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1314261e-ac1b-4ca4-8c20-11cf3ad0f281-ca-trust-extracted\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:32:40.438730 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.438709 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d4898ffc7-pmffs"] Apr 20 13:32:40.441301 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:32:40.441273 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d5ddbdf_c51d_41b4_aaf5_d0716097589c.slice/crio-3cbcef50dbd99a05a6aec97892c2e254926cd706fba14e013e80fd51767a426f WatchSource:0}: 
Error finding container 3cbcef50dbd99a05a6aec97892c2e254926cd706fba14e013e80fd51767a426f: Status 404 returned error can't find the container with id 3cbcef50dbd99a05a6aec97892c2e254926cd706fba14e013e80fd51767a426f Apr 20 13:32:40.446694 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.446645 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-b7824" podStartSLOduration=1.229985944 podStartE2EDuration="20.446629947s" podCreationTimestamp="2026-04-20 13:32:20 +0000 UTC" firstStartedPulling="2026-04-20 13:32:20.947088273 +0000 UTC m=+97.542779331" lastFinishedPulling="2026-04-20 13:32:40.163732265 +0000 UTC m=+116.759423334" observedRunningTime="2026-04-20 13:32:40.444508825 +0000 UTC m=+117.040199925" watchObservedRunningTime="2026-04-20 13:32:40.446629947 +0000 UTC m=+117.042321024" Apr 20 13:32:40.484525 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.482208 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7889fdc99c-s6dfk"] Apr 20 13:32:40.484525 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:40.484313 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7889fdc99c-s6dfk"] Apr 20 13:32:41.429326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:41.429240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4898ffc7-pmffs" event={"ID":"0d5ddbdf-c51d-41b4-aaf5-d0716097589c","Type":"ContainerStarted","Data":"143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46"} Apr 20 13:32:41.429326 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:41.429284 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4898ffc7-pmffs" event={"ID":"0d5ddbdf-c51d-41b4-aaf5-d0716097589c","Type":"ContainerStarted","Data":"3cbcef50dbd99a05a6aec97892c2e254926cd706fba14e013e80fd51767a426f"} Apr 20 13:32:41.429826 ip-10-0-142-144 kubenswrapper[2573]: 
I0420 13:32:41.429672 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-b7824" Apr 20 13:32:41.446643 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:41.446611 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-b7824" Apr 20 13:32:41.452171 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:41.452063 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d4898ffc7-pmffs" podStartSLOduration=4.452044878 podStartE2EDuration="4.452044878s" podCreationTimestamp="2026-04-20 13:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:32:41.450750781 +0000 UTC m=+118.046441857" watchObservedRunningTime="2026-04-20 13:32:41.452044878 +0000 UTC m=+118.047735955" Apr 20 13:32:41.980455 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:41.980421 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1314261e-ac1b-4ca4-8c20-11cf3ad0f281" path="/var/lib/kubelet/pods/1314261e-ac1b-4ca4-8c20-11cf3ad0f281/volumes" Apr 20 13:32:42.587531 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:42.587497 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-vtms5_27fd6d03-d487-4763-a29e-c24f39dbeb32/cluster-monitoring-operator/0.log" Apr 20 13:32:43.784192 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:43.784162 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gx9q4_0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5/init-textfile/0.log" Apr 20 13:32:43.985650 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:43.985614 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gx9q4_0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5/node-exporter/0.log" Apr 20 
Apr 20 13:32:44.184075 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:44.184041 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gx9q4_0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5/kube-rbac-proxy/0.log"
Apr 20 13:32:47.450767 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:47.450719 2573 generic.go:358] "Generic (PLEG): container finished" podID="689d9ed8-d3dd-4b84-a93f-cc84672538b6" containerID="99f9925b12a209c564b097862958b555070ddf60a27afab91a693999b6305790" exitCode=0
Apr 20 13:32:47.451291 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:47.450796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" event={"ID":"689d9ed8-d3dd-4b84-a93f-cc84672538b6","Type":"ContainerDied","Data":"99f9925b12a209c564b097862958b555070ddf60a27afab91a693999b6305790"}
Apr 20 13:32:47.451291 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:47.451209 2573 scope.go:117] "RemoveContainer" containerID="99f9925b12a209c564b097862958b555070ddf60a27afab91a693999b6305790"
Apr 20 13:32:48.134352 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:48.134310 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:48.134541 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:48.134371 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:48.140213 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:48.140179 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:48.455419 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:48.455329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j5mhj" event={"ID":"689d9ed8-d3dd-4b84-a93f-cc84672538b6","Type":"ContainerStarted","Data":"b9c76be0e9b28cb88f70d98ab3627516d94728d2140a200685eaea2638e0cf0c"}
Apr 20 13:32:48.460768 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:48.460740 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:32:48.536938 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:48.536903 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55bb49c67f-sj9bz"]
Apr 20 13:32:49.384645 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:49.384612 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-b5jqm_65ad83cd-9d60-4721-b86e-87436c3f0696/networking-console-plugin/0.log"
Apr 20 13:32:49.583606 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:49.583572 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log"
Apr 20 13:32:49.786801 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:49.786776 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/3.log"
Apr 20 13:32:49.983974 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:49.983947 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55bb49c67f-sj9bz_41f3d115-5280-417c-90a7-e966833bc357/console/0.log"
Apr 20 13:32:50.183754 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:50.183688 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d4898ffc7-pmffs_0d5ddbdf-c51d-41b4-aaf5-d0716097589c/console/0.log"
Apr 20 13:32:50.387389 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:32:50.387360 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-b7824_248d400c-8b1e-42d8-a2ec-ba381005c2c7/download-server/0.log"
Apr 20 13:33:13.562465 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:13.562394 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55bb49c67f-sj9bz" podUID="41f3d115-5280-417c-90a7-e966833bc357" containerName="console" containerID="cri-o://7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091" gracePeriod=15
Apr 20 13:33:13.831906 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:13.831884 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55bb49c67f-sj9bz_41f3d115-5280-417c-90a7-e966833bc357/console/0.log"
Apr 20 13:33:13.832043 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:13.831944 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:33:14.023714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.023677 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-console-config\") pod \"41f3d115-5280-417c-90a7-e966833bc357\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") "
Apr 20 13:33:14.023892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.023763 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-oauth-serving-cert\") pod \"41f3d115-5280-417c-90a7-e966833bc357\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") "
Apr 20 13:33:14.023892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.023796 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smmck\" (UniqueName: \"kubernetes.io/projected/41f3d115-5280-417c-90a7-e966833bc357-kube-api-access-smmck\") pod \"41f3d115-5280-417c-90a7-e966833bc357\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") "
Apr 20 13:33:14.023892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.023823 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-serving-cert\") pod \"41f3d115-5280-417c-90a7-e966833bc357\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") "
Apr 20 13:33:14.023892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.023848 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-oauth-config\") pod \"41f3d115-5280-417c-90a7-e966833bc357\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") "
Apr 20 13:33:14.023892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.023875 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-service-ca\") pod \"41f3d115-5280-417c-90a7-e966833bc357\" (UID: \"41f3d115-5280-417c-90a7-e966833bc357\") "
Apr 20 13:33:14.024201 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.024178 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-console-config" (OuterVolumeSpecName: "console-config") pod "41f3d115-5280-417c-90a7-e966833bc357" (UID: "41f3d115-5280-417c-90a7-e966833bc357"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:33:14.024352 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.024322 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "41f3d115-5280-417c-90a7-e966833bc357" (UID: "41f3d115-5280-417c-90a7-e966833bc357"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:33:14.024450 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.024385 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-service-ca" (OuterVolumeSpecName: "service-ca") pod "41f3d115-5280-417c-90a7-e966833bc357" (UID: "41f3d115-5280-417c-90a7-e966833bc357"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:33:14.026227 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.026202 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "41f3d115-5280-417c-90a7-e966833bc357" (UID: "41f3d115-5280-417c-90a7-e966833bc357"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:14.026303 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.026232 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "41f3d115-5280-417c-90a7-e966833bc357" (UID: "41f3d115-5280-417c-90a7-e966833bc357"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:14.026353 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.026336 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f3d115-5280-417c-90a7-e966833bc357-kube-api-access-smmck" (OuterVolumeSpecName: "kube-api-access-smmck") pod "41f3d115-5280-417c-90a7-e966833bc357" (UID: "41f3d115-5280-417c-90a7-e966833bc357"). InnerVolumeSpecName "kube-api-access-smmck". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:33:14.125010 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.124917 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-oauth-serving-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:33:14.125010 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.124950 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-smmck\" (UniqueName: \"kubernetes.io/projected/41f3d115-5280-417c-90a7-e966833bc357-kube-api-access-smmck\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:33:14.125010 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.124966 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-serving-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:33:14.125010 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.124977 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41f3d115-5280-417c-90a7-e966833bc357-console-oauth-config\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:33:14.125010 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.124990 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-service-ca\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:33:14.125010 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.125002 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41f3d115-5280-417c-90a7-e966833bc357-console-config\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:33:14.530877 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.530801 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55bb49c67f-sj9bz_41f3d115-5280-417c-90a7-e966833bc357/console/0.log"
Apr 20 13:33:14.530877 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.530840 2573 generic.go:358] "Generic (PLEG): container finished" podID="41f3d115-5280-417c-90a7-e966833bc357" containerID="7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091" exitCode=2
Apr 20 13:33:14.531067 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.530874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bb49c67f-sj9bz" event={"ID":"41f3d115-5280-417c-90a7-e966833bc357","Type":"ContainerDied","Data":"7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091"}
Apr 20 13:33:14.531067 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.530916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bb49c67f-sj9bz" event={"ID":"41f3d115-5280-417c-90a7-e966833bc357","Type":"ContainerDied","Data":"03eb474a28d2f0f9c2c9d965ab48dcf2e744fc9d070c9b6bbf5959f633435b0a"}
Apr 20 13:33:14.531067 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.530925 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55bb49c67f-sj9bz"
Apr 20 13:33:14.531067 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.530931 2573 scope.go:117] "RemoveContainer" containerID="7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091"
Apr 20 13:33:14.538454 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.538438 2573 scope.go:117] "RemoveContainer" containerID="7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091"
Apr 20 13:33:14.538713 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:33:14.538685 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091\": container with ID starting with 7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091 not found: ID does not exist" containerID="7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091"
Apr 20 13:33:14.538775 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.538712 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091"} err="failed to get container status \"7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091\": rpc error: code = NotFound desc = could not find container \"7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091\": container with ID starting with 7c1074c7810d4136ae3f7013f02752419aeb2514ed1c973923cf53c505711091 not found: ID does not exist"
Apr 20 13:33:14.557074 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.557056 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55bb49c67f-sj9bz"]
Apr 20 13:33:14.571846 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:14.571818 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55bb49c67f-sj9bz"]
Apr 20 13:33:15.979536 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:15.979504 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f3d115-5280-417c-90a7-e966833bc357" path="/var/lib/kubelet/pods/41f3d115-5280-417c-90a7-e966833bc357/volumes"
Apr 20 13:33:38.400620 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.400529 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-658b695877-rchdh"]
Apr 20 13:33:38.401166 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.400809 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41f3d115-5280-417c-90a7-e966833bc357" containerName="console"
Apr 20 13:33:38.401166 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.400822 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f3d115-5280-417c-90a7-e966833bc357" containerName="console"
Apr 20 13:33:38.401166 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.400835 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1314261e-ac1b-4ca4-8c20-11cf3ad0f281" containerName="registry"
Apr 20 13:33:38.401166 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.400840 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1314261e-ac1b-4ca4-8c20-11cf3ad0f281" containerName="registry"
Apr 20 13:33:38.401166 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.400884 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1314261e-ac1b-4ca4-8c20-11cf3ad0f281" containerName="registry"
Apr 20 13:33:38.401166 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.400896 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="41f3d115-5280-417c-90a7-e966833bc357" containerName="console"
Apr 20 13:33:38.403660 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.403638 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.421778 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.421746 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-658b695877-rchdh"]
Apr 20 13:33:38.511052 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.511013 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-serving-cert\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.511052 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.511052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-oauth-serving-cert\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.511256 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.511089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-oauth-config\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.511256 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.511121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-service-ca\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.511256 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.511174 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-trusted-ca-bundle\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.511256 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.511193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs9jf\" (UniqueName: \"kubernetes.io/projected/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-kube-api-access-cs9jf\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.511256 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.511220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-config\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.612483 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.612451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-trusted-ca-bundle\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.612483 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.612489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cs9jf\" (UniqueName: \"kubernetes.io/projected/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-kube-api-access-cs9jf\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.612709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.612519 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-config\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.612709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.612558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-serving-cert\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.612709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.612604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-oauth-serving-cert\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.612709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.612637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-oauth-config\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.612709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.612671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-service-ca\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.613366 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.613337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-config\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.613488 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.613438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-trusted-ca-bundle\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.613488 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.613446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-oauth-serving-cert\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.613488 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.613459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-service-ca\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.615165 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.615111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-serving-cert\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.615165 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.615152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-oauth-config\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.621273 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.621252 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs9jf\" (UniqueName: \"kubernetes.io/projected/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-kube-api-access-cs9jf\") pod \"console-658b695877-rchdh\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.712197 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.712102 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:38.840951 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:38.840921 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-658b695877-rchdh"]
Apr 20 13:33:38.843924 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:33:38.843894 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e7cd4c_6c7e_4edc_95a6_f5e0d1a477b1.slice/crio-738fc08a3809a843557eda2622c51e3d98db5fec1152663f4c59141f731228bf WatchSource:0}: Error finding container 738fc08a3809a843557eda2622c51e3d98db5fec1152663f4c59141f731228bf: Status 404 returned error can't find the container with id 738fc08a3809a843557eda2622c51e3d98db5fec1152663f4c59141f731228bf
Apr 20 13:33:39.600297 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:39.600261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-658b695877-rchdh" event={"ID":"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1","Type":"ContainerStarted","Data":"d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3"}
Apr 20 13:33:39.600297 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:39.600297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-658b695877-rchdh" event={"ID":"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1","Type":"ContainerStarted","Data":"738fc08a3809a843557eda2622c51e3d98db5fec1152663f4c59141f731228bf"}
Apr 20 13:33:39.622623 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:39.622581 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-658b695877-rchdh" podStartSLOduration=1.622567627 podStartE2EDuration="1.622567627s" podCreationTimestamp="2026-04-20 13:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:33:39.621787162 +0000 UTC m=+176.217478238" watchObservedRunningTime="2026-04-20 13:33:39.622567627 +0000 UTC m=+176.218258703"
Apr 20 13:33:48.713084 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:48.713052 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:48.713582 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:48.713211 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:48.717726 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:48.717700 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:49.631517 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:49.631485 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-658b695877-rchdh"
Apr 20 13:33:49.695615 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:33:49.695583 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d4898ffc7-pmffs"]
Apr 20 13:34:14.716660 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:14.716597 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d4898ffc7-pmffs" podUID="0d5ddbdf-c51d-41b4-aaf5-d0716097589c" containerName="console" containerID="cri-o://143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46" gracePeriod=15
Apr 20 13:34:14.948155 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:14.948125 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d4898ffc7-pmffs_0d5ddbdf-c51d-41b4-aaf5-d0716097589c/console/0.log"
Apr 20 13:34:14.948258 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:14.948200 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:34:15.088885 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.088857 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-serving-cert\") pod \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") "
Apr 20 13:34:15.089057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.088896 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-oauth-config\") pod \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") "
Apr 20 13:34:15.089057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.088927 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-config\") pod \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") "
Apr 20 13:34:15.089057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.088954 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxgl4\" (UniqueName: \"kubernetes.io/projected/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-kube-api-access-pxgl4\") pod \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") "
Apr 20 13:34:15.089057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.088981 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-oauth-serving-cert\") pod \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") "
Apr 20 13:34:15.089057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.088998 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-service-ca\") pod \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") "
Apr 20 13:34:15.089057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.089023 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-trusted-ca-bundle\") pod \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\" (UID: \"0d5ddbdf-c51d-41b4-aaf5-d0716097589c\") "
Apr 20 13:34:15.089585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.089447 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-config" (OuterVolumeSpecName: "console-config") pod "0d5ddbdf-c51d-41b4-aaf5-d0716097589c" (UID: "0d5ddbdf-c51d-41b4-aaf5-d0716097589c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:34:15.089585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.089479 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0d5ddbdf-c51d-41b4-aaf5-d0716097589c" (UID: "0d5ddbdf-c51d-41b4-aaf5-d0716097589c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:34:15.089585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.089532 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-service-ca" (OuterVolumeSpecName: "service-ca") pod "0d5ddbdf-c51d-41b4-aaf5-d0716097589c" (UID: "0d5ddbdf-c51d-41b4-aaf5-d0716097589c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:34:15.089770 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.089707 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0d5ddbdf-c51d-41b4-aaf5-d0716097589c" (UID: "0d5ddbdf-c51d-41b4-aaf5-d0716097589c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:34:15.091321 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.091300 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-kube-api-access-pxgl4" (OuterVolumeSpecName: "kube-api-access-pxgl4") pod "0d5ddbdf-c51d-41b4-aaf5-d0716097589c" (UID: "0d5ddbdf-c51d-41b4-aaf5-d0716097589c"). InnerVolumeSpecName "kube-api-access-pxgl4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:34:15.091524 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.091508 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0d5ddbdf-c51d-41b4-aaf5-d0716097589c" (UID: "0d5ddbdf-c51d-41b4-aaf5-d0716097589c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:34:15.091588 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.091550 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0d5ddbdf-c51d-41b4-aaf5-d0716097589c" (UID: "0d5ddbdf-c51d-41b4-aaf5-d0716097589c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:34:15.190344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.190307 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-serving-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:34:15.190344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.190338 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-oauth-config\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:34:15.190344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.190352 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-console-config\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:34:15.190581 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.190366 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxgl4\" (UniqueName: \"kubernetes.io/projected/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-kube-api-access-pxgl4\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:34:15.190581 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.190378 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-oauth-serving-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:34:15.190581 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.190391 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-service-ca\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:34:15.190581 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.190403 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d5ddbdf-c51d-41b4-aaf5-d0716097589c-trusted-ca-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:34:15.696691 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.696666 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d4898ffc7-pmffs_0d5ddbdf-c51d-41b4-aaf5-d0716097589c/console/0.log"
Apr 20 13:34:15.696862 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.696705 2573 generic.go:358] "Generic (PLEG): container finished" podID="0d5ddbdf-c51d-41b4-aaf5-d0716097589c" containerID="143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46" exitCode=2
Apr 20 13:34:15.696862 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.696750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4898ffc7-pmffs" event={"ID":"0d5ddbdf-c51d-41b4-aaf5-d0716097589c","Type":"ContainerDied","Data":"143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46"}
Apr 20 13:34:15.696862 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.696764 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d4898ffc7-pmffs"
Apr 20 13:34:15.696862 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.696780 2573 scope.go:117] "RemoveContainer" containerID="143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46"
Apr 20 13:34:15.696862 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.696771 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4898ffc7-pmffs" event={"ID":"0d5ddbdf-c51d-41b4-aaf5-d0716097589c","Type":"ContainerDied","Data":"3cbcef50dbd99a05a6aec97892c2e254926cd706fba14e013e80fd51767a426f"}
Apr 20 13:34:15.704888 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.704870 2573 scope.go:117] "RemoveContainer" containerID="143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46"
Apr 20 13:34:15.705183 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:34:15.705151 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46\": container with ID starting with 143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46 not found: ID does not exist" containerID="143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46"
Apr 20 13:34:15.705270 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.705182 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46"} err="failed to get container status \"143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46\": rpc error: code = NotFound desc = could not find container \"143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46\": container with ID starting with 143ce3340876384ffaeb5b4c6f0b32c9e4bbaba6f8eaad1460eccab9f30f1a46 not found: ID does not exist"
Apr 20 13:34:15.736705 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.736679 2573
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d4898ffc7-pmffs"] Apr 20 13:34:15.741887 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.741860 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d4898ffc7-pmffs"] Apr 20 13:34:15.979555 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:15.979481 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5ddbdf-c51d-41b4-aaf5-d0716097589c" path="/var/lib/kubelet/pods/0d5ddbdf-c51d-41b4-aaf5-d0716097589c/volumes" Apr 20 13:34:59.109230 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.109200 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59bd67467f-6rb9f"] Apr 20 13:34:59.109711 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.109539 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d5ddbdf-c51d-41b4-aaf5-d0716097589c" containerName="console" Apr 20 13:34:59.109711 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.109554 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5ddbdf-c51d-41b4-aaf5-d0716097589c" containerName="console" Apr 20 13:34:59.109711 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.109614 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d5ddbdf-c51d-41b4-aaf5-d0716097589c" containerName="console" Apr 20 13:34:59.113647 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.113626 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.123455 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.123432 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59bd67467f-6rb9f"] Apr 20 13:34:59.293890 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.293855 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-service-ca\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.293890 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.293891 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-oauth-serving-cert\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.294089 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.293938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqbkt\" (UniqueName: \"kubernetes.io/projected/a40190b1-f8f5-4c0a-9267-a3f911eae204-kube-api-access-nqbkt\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.294089 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.293962 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-oauth-config\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 
13:34:59.294089 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.293986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-config\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.294089 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.294025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-trusted-ca-bundle\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.294089 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.294046 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-serving-cert\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.394597 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.394527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqbkt\" (UniqueName: \"kubernetes.io/projected/a40190b1-f8f5-4c0a-9267-a3f911eae204-kube-api-access-nqbkt\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.394597 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.394564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-oauth-config\") pod 
\"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.394597 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.394584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-config\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.394768 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.394609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-trusted-ca-bundle\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.394768 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.394663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-serving-cert\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.394768 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.394724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-service-ca\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.394768 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.394741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-oauth-serving-cert\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.395455 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.395433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-oauth-serving-cert\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.395576 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.395499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-config\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.395576 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.395499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-service-ca\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.395576 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.395541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-trusted-ca-bundle\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.397084 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.397061 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-serving-cert\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.397196 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.397181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-oauth-config\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.402696 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.402668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqbkt\" (UniqueName: \"kubernetes.io/projected/a40190b1-f8f5-4c0a-9267-a3f911eae204-kube-api-access-nqbkt\") pod \"console-59bd67467f-6rb9f\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.424485 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.424462 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:34:59.539443 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.539421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59bd67467f-6rb9f"] Apr 20 13:34:59.541992 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:34:59.541964 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda40190b1_f8f5_4c0a_9267_a3f911eae204.slice/crio-d6af28a5ba0efd100783ead1e815ee418e3a2e1ca3fd32dc3ee94a44373ac5a9 WatchSource:0}: Error finding container d6af28a5ba0efd100783ead1e815ee418e3a2e1ca3fd32dc3ee94a44373ac5a9: Status 404 returned error can't find the container with id d6af28a5ba0efd100783ead1e815ee418e3a2e1ca3fd32dc3ee94a44373ac5a9 Apr 20 13:34:59.818691 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.818657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bd67467f-6rb9f" event={"ID":"a40190b1-f8f5-4c0a-9267-a3f911eae204","Type":"ContainerStarted","Data":"233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d"} Apr 20 13:34:59.818691 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.818692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bd67467f-6rb9f" event={"ID":"a40190b1-f8f5-4c0a-9267-a3f911eae204","Type":"ContainerStarted","Data":"d6af28a5ba0efd100783ead1e815ee418e3a2e1ca3fd32dc3ee94a44373ac5a9"} Apr 20 13:34:59.838316 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:34:59.838266 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59bd67467f-6rb9f" podStartSLOduration=0.838251683 podStartE2EDuration="838.251683ms" podCreationTimestamp="2026-04-20 13:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:34:59.836824921 +0000 UTC 
m=+256.432516009" watchObservedRunningTime="2026-04-20 13:34:59.838251683 +0000 UTC m=+256.433942759" Apr 20 13:35:09.425463 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:09.425373 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:35:09.425463 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:09.425413 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:35:09.430004 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:09.429976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:35:09.848685 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:09.848660 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:35:09.919877 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:09.919842 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-658b695877-rchdh"] Apr 20 13:35:34.941508 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:34.941445 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-658b695877-rchdh" podUID="88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" containerName="console" containerID="cri-o://d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3" gracePeriod=15 Apr 20 13:35:35.179874 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.179850 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-658b695877-rchdh_88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1/console/0.log" Apr 20 13:35:35.179991 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.179915 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-658b695877-rchdh" Apr 20 13:35:35.258909 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.258824 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-trusted-ca-bundle\") pod \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " Apr 20 13:35:35.258909 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.258872 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-oauth-serving-cert\") pod \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " Apr 20 13:35:35.258909 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.258890 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-oauth-config\") pod \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " Apr 20 13:35:35.258909 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.258908 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-serving-cert\") pod \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " Apr 20 13:35:35.259251 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.258940 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs9jf\" (UniqueName: \"kubernetes.io/projected/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-kube-api-access-cs9jf\") pod \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " Apr 20 
13:35:35.259251 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.258965 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-config\") pod \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " Apr 20 13:35:35.259251 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.259008 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-service-ca\") pod \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\" (UID: \"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1\") " Apr 20 13:35:35.259501 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.259473 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" (UID: "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:35:35.259567 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.259487 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-config" (OuterVolumeSpecName: "console-config") pod "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" (UID: "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:35:35.259567 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.259496 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" (UID: "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:35:35.259567 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.259479 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-service-ca" (OuterVolumeSpecName: "service-ca") pod "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" (UID: "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:35:35.261067 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.261045 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" (UID: "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:35:35.261451 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.261433 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-kube-api-access-cs9jf" (OuterVolumeSpecName: "kube-api-access-cs9jf") pod "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" (UID: "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1"). InnerVolumeSpecName "kube-api-access-cs9jf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:35:35.261501 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.261445 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" (UID: "88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:35:35.359492 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.359464 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-trusted-ca-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:35:35.359492 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.359489 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-oauth-serving-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:35:35.359492 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.359498 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-oauth-config\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:35:35.359756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.359506 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-serving-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:35:35.359756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.359515 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cs9jf\" (UniqueName: 
\"kubernetes.io/projected/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-kube-api-access-cs9jf\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:35:35.359756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.359523 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-console-config\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:35:35.359756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.359532 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1-service-ca\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:35:35.913947 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.913915 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-658b695877-rchdh_88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1/console/0.log" Apr 20 13:35:35.914115 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.913955 2573 generic.go:358] "Generic (PLEG): container finished" podID="88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" containerID="d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3" exitCode=2 Apr 20 13:35:35.914115 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.914022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-658b695877-rchdh" event={"ID":"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1","Type":"ContainerDied","Data":"d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3"} Apr 20 13:35:35.914115 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.914048 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-658b695877-rchdh" event={"ID":"88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1","Type":"ContainerDied","Data":"738fc08a3809a843557eda2622c51e3d98db5fec1152663f4c59141f731228bf"} Apr 20 13:35:35.914115 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:35:35.914049 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-658b695877-rchdh" Apr 20 13:35:35.914115 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.914062 2573 scope.go:117] "RemoveContainer" containerID="d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3" Apr 20 13:35:35.922229 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.922212 2573 scope.go:117] "RemoveContainer" containerID="d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3" Apr 20 13:35:35.922524 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:35:35.922504 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3\": container with ID starting with d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3 not found: ID does not exist" containerID="d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3" Apr 20 13:35:35.922585 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.922534 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3"} err="failed to get container status \"d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3\": rpc error: code = NotFound desc = could not find container \"d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3\": container with ID starting with d825c8c5ec83fcd4183346809c6e5b21f032e6672e4054d01dfd2b1188a23ba3 not found: ID does not exist" Apr 20 13:35:35.934351 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.934326 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-658b695877-rchdh"] Apr 20 13:35:35.938049 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.938029 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-658b695877-rchdh"]
Apr 20 13:35:35.980035 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:35.980010 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" path="/var/lib/kubelet/pods/88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1/volumes"
Apr 20 13:35:43.879804 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:43.879776 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log"
Apr 20 13:35:43.880287 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:43.879959 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log"
Apr 20 13:35:43.882599 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:43.882581 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log"
Apr 20 13:35:43.882711 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:43.882586 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log"
Apr 20 13:35:43.889071 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:35:43.889054 2573 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 13:36:11.721672 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.721629 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"]
Apr 20 13:36:11.724117 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.722042 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" containerName="console"
Apr 20 13:36:11.724117 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.722060 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" containerName="console"
Apr 20 13:36:11.724117 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.722117 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="88e7cd4c-6c7e-4edc-95a6-f5e0d1a477b1" containerName="console"
Apr 20 13:36:11.724962 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.724945 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.727433 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.727410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 13:36:11.727549 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.727450 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 13:36:11.727614 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.727556 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bsw7x\""
Apr 20 13:36:11.735230 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.735210 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"]
Apr 20 13:36:11.830763 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.830728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kqdq\" (UniqueName: \"kubernetes.io/projected/c0c8d1f8-e966-454f-bd4e-568ce477830b-kube-api-access-8kqdq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.830931 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.830777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.830931 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.830896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.931872 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.931838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kqdq\" (UniqueName: \"kubernetes.io/projected/c0c8d1f8-e966-454f-bd4e-568ce477830b-kube-api-access-8kqdq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.932065 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.931880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.932065 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.931932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.932293 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.932274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.932379 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.932323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:11.940575 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:11.940555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kqdq\" (UniqueName: \"kubernetes.io/projected/c0c8d1f8-e966-454f-bd4e-568ce477830b-kube-api-access-8kqdq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:12.033435 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:12.033351 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:12.155603 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:12.155570 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"]
Apr 20 13:36:12.159425 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:36:12.159399 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c8d1f8_e966_454f_bd4e_568ce477830b.slice/crio-1fba5a24728009306d9494fd75de9848d311eb21d77cdc6618769f742de71761 WatchSource:0}: Error finding container 1fba5a24728009306d9494fd75de9848d311eb21d77cdc6618769f742de71761: Status 404 returned error can't find the container with id 1fba5a24728009306d9494fd75de9848d311eb21d77cdc6618769f742de71761
Apr 20 13:36:12.161099 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:12.161083 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 13:36:13.011112 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:13.011073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2" event={"ID":"c0c8d1f8-e966-454f-bd4e-568ce477830b","Type":"ContainerStarted","Data":"1fba5a24728009306d9494fd75de9848d311eb21d77cdc6618769f742de71761"}
Apr 20 13:36:18.025643 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:18.025613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2" event={"ID":"c0c8d1f8-e966-454f-bd4e-568ce477830b","Type":"ContainerStarted","Data":"2d141f0cebcbf138ebecd81e9e268a23d0a1469f026d964a51cde8bef3aaff24"}
Apr 20 13:36:19.029378 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:19.029347 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerID="2d141f0cebcbf138ebecd81e9e268a23d0a1469f026d964a51cde8bef3aaff24" exitCode=0
Apr 20 13:36:19.029741 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:19.029446 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2" event={"ID":"c0c8d1f8-e966-454f-bd4e-568ce477830b","Type":"ContainerDied","Data":"2d141f0cebcbf138ebecd81e9e268a23d0a1469f026d964a51cde8bef3aaff24"}
Apr 20 13:36:21.037176 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:21.037124 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerID="307b67939f578173e994192bb20755760969b1c397c1a1f4e7ca089f75fc1c05" exitCode=0
Apr 20 13:36:21.037545 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:21.037185 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2" event={"ID":"c0c8d1f8-e966-454f-bd4e-568ce477830b","Type":"ContainerDied","Data":"307b67939f578173e994192bb20755760969b1c397c1a1f4e7ca089f75fc1c05"}
Apr 20 13:36:28.060003 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:28.059967 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerID="0f781265b47bfa4a655db4356f259982c2e1665cb41ef8e4ecd988d04ae71459" exitCode=0
Apr 20 13:36:28.060422 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:28.060040 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2" event={"ID":"c0c8d1f8-e966-454f-bd4e-568ce477830b","Type":"ContainerDied","Data":"0f781265b47bfa4a655db4356f259982c2e1665cb41ef8e4ecd988d04ae71459"}
Apr 20 13:36:29.175678 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.175657 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:29.262691 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.262663 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kqdq\" (UniqueName: \"kubernetes.io/projected/c0c8d1f8-e966-454f-bd4e-568ce477830b-kube-api-access-8kqdq\") pod \"c0c8d1f8-e966-454f-bd4e-568ce477830b\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") "
Apr 20 13:36:29.262845 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.262718 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-bundle\") pod \"c0c8d1f8-e966-454f-bd4e-568ce477830b\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") "
Apr 20 13:36:29.262845 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.262759 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-util\") pod \"c0c8d1f8-e966-454f-bd4e-568ce477830b\" (UID: \"c0c8d1f8-e966-454f-bd4e-568ce477830b\") "
Apr 20 13:36:29.263299 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.263267 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-bundle" (OuterVolumeSpecName: "bundle") pod "c0c8d1f8-e966-454f-bd4e-568ce477830b" (UID: "c0c8d1f8-e966-454f-bd4e-568ce477830b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 13:36:29.264890 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.264868 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c8d1f8-e966-454f-bd4e-568ce477830b-kube-api-access-8kqdq" (OuterVolumeSpecName: "kube-api-access-8kqdq") pod "c0c8d1f8-e966-454f-bd4e-568ce477830b" (UID: "c0c8d1f8-e966-454f-bd4e-568ce477830b"). InnerVolumeSpecName "kube-api-access-8kqdq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:36:29.266840 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.266820 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-util" (OuterVolumeSpecName: "util") pod "c0c8d1f8-e966-454f-bd4e-568ce477830b" (UID: "c0c8d1f8-e966-454f-bd4e-568ce477830b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 13:36:29.363621 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.363549 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:36:29.363621 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.363577 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kqdq\" (UniqueName: \"kubernetes.io/projected/c0c8d1f8-e966-454f-bd4e-568ce477830b-kube-api-access-8kqdq\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:36:29.363621 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:29.363588 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0c8d1f8-e966-454f-bd4e-568ce477830b-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:36:30.066964 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:30.066930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2" event={"ID":"c0c8d1f8-e966-454f-bd4e-568ce477830b","Type":"ContainerDied","Data":"1fba5a24728009306d9494fd75de9848d311eb21d77cdc6618769f742de71761"}
Apr 20 13:36:30.066964 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:30.066967 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fba5a24728009306d9494fd75de9848d311eb21d77cdc6618769f742de71761"
Apr 20 13:36:30.067189 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:30.067030 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55pml2"
Apr 20 13:36:34.121561 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.121479 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"]
Apr 20 13:36:34.122028 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.121770 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerName="extract"
Apr 20 13:36:34.122028 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.121782 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerName="extract"
Apr 20 13:36:34.122028 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.121798 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerName="util"
Apr 20 13:36:34.122028 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.121804 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerName="util"
Apr 20 13:36:34.122028 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.121812 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerName="pull"
Apr 20 13:36:34.122028 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.121819 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerName="pull"
Apr 20 13:36:34.122028 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.121868 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0c8d1f8-e966-454f-bd4e-568ce477830b" containerName="extract"
Apr 20 13:36:34.142094 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.142065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"]
Apr 20 13:36:34.142284 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.142215 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"
Apr 20 13:36:34.144564 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.144540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 20 13:36:34.144564 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.144553 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 20 13:36:34.144761 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.144553 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-7fqjg\""
Apr 20 13:36:34.197551 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.197517 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784ns\" (UniqueName: \"kubernetes.io/projected/ba553775-3f37-407a-84bb-c94cae5b2805-kube-api-access-784ns\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-p6m2w\" (UID: \"ba553775-3f37-407a-84bb-c94cae5b2805\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"
Apr 20 13:36:34.197729 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.197562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba553775-3f37-407a-84bb-c94cae5b2805-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-p6m2w\" (UID: \"ba553775-3f37-407a-84bb-c94cae5b2805\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"
Apr 20 13:36:34.297994 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.297959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-784ns\" (UniqueName: \"kubernetes.io/projected/ba553775-3f37-407a-84bb-c94cae5b2805-kube-api-access-784ns\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-p6m2w\" (UID: \"ba553775-3f37-407a-84bb-c94cae5b2805\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"
Apr 20 13:36:34.298113 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.298010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba553775-3f37-407a-84bb-c94cae5b2805-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-p6m2w\" (UID: \"ba553775-3f37-407a-84bb-c94cae5b2805\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"
Apr 20 13:36:34.298423 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.298404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba553775-3f37-407a-84bb-c94cae5b2805-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-p6m2w\" (UID: \"ba553775-3f37-407a-84bb-c94cae5b2805\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"
Apr 20 13:36:34.306667 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.306647 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-784ns\" (UniqueName: \"kubernetes.io/projected/ba553775-3f37-407a-84bb-c94cae5b2805-kube-api-access-784ns\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-p6m2w\" (UID: \"ba553775-3f37-407a-84bb-c94cae5b2805\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"
Apr 20 13:36:34.451680 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.451592 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"
Apr 20 13:36:34.573550 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:34.573517 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w"]
Apr 20 13:36:34.577455 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:36:34.577428 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba553775_3f37_407a_84bb_c94cae5b2805.slice/crio-f8f01eb9c4de5c4c2a500f29dd861de3ba61c4da6ee1a17e906affc27f5a8204 WatchSource:0}: Error finding container f8f01eb9c4de5c4c2a500f29dd861de3ba61c4da6ee1a17e906affc27f5a8204: Status 404 returned error can't find the container with id f8f01eb9c4de5c4c2a500f29dd861de3ba61c4da6ee1a17e906affc27f5a8204
Apr 20 13:36:35.081237 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:35.081206 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w" event={"ID":"ba553775-3f37-407a-84bb-c94cae5b2805","Type":"ContainerStarted","Data":"f8f01eb9c4de5c4c2a500f29dd861de3ba61c4da6ee1a17e906affc27f5a8204"}
Apr 20 13:36:37.088514 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:37.088478 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w" event={"ID":"ba553775-3f37-407a-84bb-c94cae5b2805","Type":"ContainerStarted","Data":"7acbb83e86d85cf73766a9a345682b00930217032221b84b4c3ceaa6ca15adc0"}
Apr 20 13:36:37.113571 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:37.113508 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-p6m2w" podStartSLOduration=0.761097284 podStartE2EDuration="3.113490461s" podCreationTimestamp="2026-04-20 13:36:34 +0000 UTC" firstStartedPulling="2026-04-20 13:36:34.579927548 +0000 UTC m=+351.175618616" lastFinishedPulling="2026-04-20 13:36:36.932320739 +0000 UTC m=+353.528011793" observedRunningTime="2026-04-20 13:36:37.1112801 +0000 UTC m=+353.706971176" watchObservedRunningTime="2026-04-20 13:36:37.113490461 +0000 UTC m=+353.709181538"
Apr 20 13:36:38.451005 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.450965 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"]
Apr 20 13:36:38.454537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.454514 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.456968 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.456946 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 13:36:38.457679 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.457661 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bsw7x\""
Apr 20 13:36:38.457776 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.457684 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 13:36:38.464024 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.464002 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"]
Apr 20 13:36:38.529528 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.529501 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.529687 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.529556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2sj9\" (UniqueName: \"kubernetes.io/projected/b5b2d015-4b34-4254-87a4-4d95f40d730d-kube-api-access-k2sj9\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.529687 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.529610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.630394 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.630343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.630609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.630415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2sj9\" (UniqueName: \"kubernetes.io/projected/b5b2d015-4b34-4254-87a4-4d95f40d730d-kube-api-access-k2sj9\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.630609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.630433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.630770 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.630756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.630805 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.630757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.641180 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.641155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2sj9\" (UniqueName: \"kubernetes.io/projected/b5b2d015-4b34-4254-87a4-4d95f40d730d-kube-api-access-k2sj9\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.763136 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.763034 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:38.881570 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:38.881539 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"]
Apr 20 13:36:38.885224 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:36:38.885189 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b2d015_4b34_4254_87a4_4d95f40d730d.slice/crio-1f1d4b6464ec61ff54be7259387798be263ba62488101ee6d842d7ac30a90a42 WatchSource:0}: Error finding container 1f1d4b6464ec61ff54be7259387798be263ba62488101ee6d842d7ac30a90a42: Status 404 returned error can't find the container with id 1f1d4b6464ec61ff54be7259387798be263ba62488101ee6d842d7ac30a90a42
Apr 20 13:36:39.096696 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:39.096658 2573 generic.go:358] "Generic (PLEG): container finished" podID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerID="b32b2643b544af0510312dd6a806668d38ebca30049f2dc4d22862eeb77ce7c7" exitCode=0
Apr 20 13:36:39.096866 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:39.096741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6" event={"ID":"b5b2d015-4b34-4254-87a4-4d95f40d730d","Type":"ContainerDied","Data":"b32b2643b544af0510312dd6a806668d38ebca30049f2dc4d22862eeb77ce7c7"}
Apr 20 13:36:39.096866 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:39.096767 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6" event={"ID":"b5b2d015-4b34-4254-87a4-4d95f40d730d","Type":"ContainerStarted","Data":"1f1d4b6464ec61ff54be7259387798be263ba62488101ee6d842d7ac30a90a42"}
Apr 20 13:36:42.108498 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.108464 2573 generic.go:358] "Generic (PLEG): container finished" podID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerID="2da2de8cd5c7fa5ca6c87b837cbd1bd7f2056e7b4378f4010220a4718f1c67d9" exitCode=0
Apr 20 13:36:42.108907 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.108541 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6" event={"ID":"b5b2d015-4b34-4254-87a4-4d95f40d730d","Type":"ContainerDied","Data":"2da2de8cd5c7fa5ca6c87b837cbd1bd7f2056e7b4378f4010220a4718f1c67d9"}
Apr 20 13:36:42.499302 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.499224 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"]
Apr 20 13:36:42.502300 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.502284 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"
Apr 20 13:36:42.504885 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.504863 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 13:36:42.504979 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.504955 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 13:36:42.505053 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.505032 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-6z5lm\""
Apr 20 13:36:42.510208 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.510186 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"]
Apr 20 13:36:42.558865 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.558837 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cc2620f-48d5-4585-9b83-4ecb176b9eec-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-zh4p2\" (UID: \"0cc2620f-48d5-4585-9b83-4ecb176b9eec\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"
Apr 20 13:36:42.558865 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.558864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw9kd\" (UniqueName: \"kubernetes.io/projected/0cc2620f-48d5-4585-9b83-4ecb176b9eec-kube-api-access-xw9kd\") pod \"cert-manager-cainjector-8966b78d4-zh4p2\" (UID: \"0cc2620f-48d5-4585-9b83-4ecb176b9eec\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"
Apr 20 13:36:42.660251 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.660213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cc2620f-48d5-4585-9b83-4ecb176b9eec-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-zh4p2\" (UID: \"0cc2620f-48d5-4585-9b83-4ecb176b9eec\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"
Apr 20 13:36:42.660427 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.660254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xw9kd\" (UniqueName: \"kubernetes.io/projected/0cc2620f-48d5-4585-9b83-4ecb176b9eec-kube-api-access-xw9kd\") pod \"cert-manager-cainjector-8966b78d4-zh4p2\" (UID: \"0cc2620f-48d5-4585-9b83-4ecb176b9eec\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"
Apr 20 13:36:42.669656 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.669626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw9kd\" (UniqueName: \"kubernetes.io/projected/0cc2620f-48d5-4585-9b83-4ecb176b9eec-kube-api-access-xw9kd\") pod \"cert-manager-cainjector-8966b78d4-zh4p2\" (UID: \"0cc2620f-48d5-4585-9b83-4ecb176b9eec\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"
Apr 20 13:36:42.669860 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.669839 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cc2620f-48d5-4585-9b83-4ecb176b9eec-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-zh4p2\" (UID: \"0cc2620f-48d5-4585-9b83-4ecb176b9eec\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"
Apr 20 13:36:42.811588 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.811557 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"
Apr 20 13:36:42.935579 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:42.935542 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-zh4p2"]
Apr 20 13:36:42.938257 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:36:42.938230 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cc2620f_48d5_4585_9b83_4ecb176b9eec.slice/crio-b2fd88d272730f5748a4d8fa2f501ac44bcdf43a557c5d9e035532ff25bfe684 WatchSource:0}: Error finding container b2fd88d272730f5748a4d8fa2f501ac44bcdf43a557c5d9e035532ff25bfe684: Status 404 returned error can't find the container with id b2fd88d272730f5748a4d8fa2f501ac44bcdf43a557c5d9e035532ff25bfe684
Apr 20 13:36:43.112369 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:43.112273 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2" event={"ID":"0cc2620f-48d5-4585-9b83-4ecb176b9eec","Type":"ContainerStarted","Data":"b2fd88d272730f5748a4d8fa2f501ac44bcdf43a557c5d9e035532ff25bfe684"}
Apr 20 13:36:43.114043 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:43.114021 2573 generic.go:358] "Generic (PLEG): container finished" podID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerID="3e489aa02161c29c1e95ba4465b7511625a6e1c2911ab8c453f231829b2331b7" exitCode=0
Apr 20 13:36:43.114170 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:43.114073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6" event={"ID":"b5b2d015-4b34-4254-87a4-4d95f40d730d","Type":"ContainerDied","Data":"3e489aa02161c29c1e95ba4465b7511625a6e1c2911ab8c453f231829b2331b7"}
Apr 20 13:36:44.254481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.254449 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6"
Apr 20 13:36:44.372446 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.372371 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2sj9\" (UniqueName: \"kubernetes.io/projected/b5b2d015-4b34-4254-87a4-4d95f40d730d-kube-api-access-k2sj9\") pod \"b5b2d015-4b34-4254-87a4-4d95f40d730d\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") "
Apr 20 13:36:44.372446 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.372407 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-util\") pod \"b5b2d015-4b34-4254-87a4-4d95f40d730d\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") "
Apr 20 13:36:44.372446 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.372440 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-bundle\") pod \"b5b2d015-4b34-4254-87a4-4d95f40d730d\" (UID: \"b5b2d015-4b34-4254-87a4-4d95f40d730d\") "
Apr 20 13:36:44.372873 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.372832 2573 operation_generator.go:781]
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-bundle" (OuterVolumeSpecName: "bundle") pod "b5b2d015-4b34-4254-87a4-4d95f40d730d" (UID: "b5b2d015-4b34-4254-87a4-4d95f40d730d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:36:44.374826 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.374791 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b2d015-4b34-4254-87a4-4d95f40d730d-kube-api-access-k2sj9" (OuterVolumeSpecName: "kube-api-access-k2sj9") pod "b5b2d015-4b34-4254-87a4-4d95f40d730d" (UID: "b5b2d015-4b34-4254-87a4-4d95f40d730d"). InnerVolumeSpecName "kube-api-access-k2sj9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:36:44.378208 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.378156 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-util" (OuterVolumeSpecName: "util") pod "b5b2d015-4b34-4254-87a4-4d95f40d730d" (UID: "b5b2d015-4b34-4254-87a4-4d95f40d730d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:36:44.473519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.473479 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k2sj9\" (UniqueName: \"kubernetes.io/projected/b5b2d015-4b34-4254-87a4-4d95f40d730d-kube-api-access-k2sj9\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:36:44.473519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.473512 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:36:44.473519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:44.473527 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5b2d015-4b34-4254-87a4-4d95f40d730d-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:36:45.123521 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:45.123476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6" event={"ID":"b5b2d015-4b34-4254-87a4-4d95f40d730d","Type":"ContainerDied","Data":"1f1d4b6464ec61ff54be7259387798be263ba62488101ee6d842d7ac30a90a42"} Apr 20 13:36:45.123721 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:45.123529 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f1d4b6464ec61ff54be7259387798be263ba62488101ee6d842d7ac30a90a42" Apr 20 13:36:45.123721 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:45.123497 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdlqf6" Apr 20 13:36:46.127334 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:46.127301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2" event={"ID":"0cc2620f-48d5-4585-9b83-4ecb176b9eec","Type":"ContainerStarted","Data":"e8089e0e81ba838eea367594291bb3e5c6dbf1e707ebfd256b9f259fb81d22ea"} Apr 20 13:36:46.143566 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:46.143463 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-zh4p2" podStartSLOduration=1.5262672849999999 podStartE2EDuration="4.143446674s" podCreationTimestamp="2026-04-20 13:36:42 +0000 UTC" firstStartedPulling="2026-04-20 13:36:42.940470992 +0000 UTC m=+359.536162046" lastFinishedPulling="2026-04-20 13:36:45.557650366 +0000 UTC m=+362.153341435" observedRunningTime="2026-04-20 13:36:46.143044471 +0000 UTC m=+362.738735547" watchObservedRunningTime="2026-04-20 13:36:46.143446674 +0000 UTC m=+362.739137752" Apr 20 13:36:51.650989 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.650950 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr"] Apr 20 13:36:51.651481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.651376 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerName="util" Apr 20 13:36:51.651481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.651394 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerName="util" Apr 20 13:36:51.651481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.651409 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerName="pull" Apr 20 
13:36:51.651481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.651416 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerName="pull" Apr 20 13:36:51.651481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.651426 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerName="extract" Apr 20 13:36:51.651481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.651434 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerName="extract" Apr 20 13:36:51.651763 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.651501 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b2d015-4b34-4254-87a4-4d95f40d730d" containerName="extract" Apr 20 13:36:51.654466 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.654448 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" Apr 20 13:36:51.656868 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.656846 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 13:36:51.657365 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.657343 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:36:51.657779 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.657765 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-k747w\"" Apr 20 13:36:51.661371 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.661351 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr"] Apr 20 13:36:51.727971 ip-10-0-142-144 kubenswrapper[2573]: 
I0420 13:36:51.727932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50e2cde6-7239-4b66-b403-a8ba79225068-tmp\") pod \"openshift-lws-operator-bfc7f696d-bnwlr\" (UID: \"50e2cde6-7239-4b66-b403-a8ba79225068\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" Apr 20 13:36:51.728189 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.727982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b9sz\" (UniqueName: \"kubernetes.io/projected/50e2cde6-7239-4b66-b403-a8ba79225068-kube-api-access-5b9sz\") pod \"openshift-lws-operator-bfc7f696d-bnwlr\" (UID: \"50e2cde6-7239-4b66-b403-a8ba79225068\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" Apr 20 13:36:51.829078 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.829045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50e2cde6-7239-4b66-b403-a8ba79225068-tmp\") pod \"openshift-lws-operator-bfc7f696d-bnwlr\" (UID: \"50e2cde6-7239-4b66-b403-a8ba79225068\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" Apr 20 13:36:51.829228 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.829090 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b9sz\" (UniqueName: \"kubernetes.io/projected/50e2cde6-7239-4b66-b403-a8ba79225068-kube-api-access-5b9sz\") pod \"openshift-lws-operator-bfc7f696d-bnwlr\" (UID: \"50e2cde6-7239-4b66-b403-a8ba79225068\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" Apr 20 13:36:51.829465 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.829446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50e2cde6-7239-4b66-b403-a8ba79225068-tmp\") pod 
\"openshift-lws-operator-bfc7f696d-bnwlr\" (UID: \"50e2cde6-7239-4b66-b403-a8ba79225068\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" Apr 20 13:36:51.837725 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.837701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b9sz\" (UniqueName: \"kubernetes.io/projected/50e2cde6-7239-4b66-b403-a8ba79225068-kube-api-access-5b9sz\") pod \"openshift-lws-operator-bfc7f696d-bnwlr\" (UID: \"50e2cde6-7239-4b66-b403-a8ba79225068\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" Apr 20 13:36:51.964087 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:51.963984 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" Apr 20 13:36:52.085734 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:52.085611 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr"] Apr 20 13:36:52.088390 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:36:52.088360 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50e2cde6_7239_4b66_b403_a8ba79225068.slice/crio-9cc3eb0ddd4fe1a63b01b97298b0f2e56f094fcb43b62fb3ce21e9c4484b997a WatchSource:0}: Error finding container 9cc3eb0ddd4fe1a63b01b97298b0f2e56f094fcb43b62fb3ce21e9c4484b997a: Status 404 returned error can't find the container with id 9cc3eb0ddd4fe1a63b01b97298b0f2e56f094fcb43b62fb3ce21e9c4484b997a Apr 20 13:36:52.149725 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:52.149680 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" event={"ID":"50e2cde6-7239-4b66-b403-a8ba79225068","Type":"ContainerStarted","Data":"9cc3eb0ddd4fe1a63b01b97298b0f2e56f094fcb43b62fb3ce21e9c4484b997a"} Apr 20 13:36:55.161475 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:36:55.161440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" event={"ID":"50e2cde6-7239-4b66-b403-a8ba79225068","Type":"ContainerStarted","Data":"e6274dc68a772d27590be41f347bfba414c836a47a1e8ea7472751117440b92d"} Apr 20 13:36:55.177571 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:55.177510 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bnwlr" podStartSLOduration=1.952869199 podStartE2EDuration="4.177491601s" podCreationTimestamp="2026-04-20 13:36:51 +0000 UTC" firstStartedPulling="2026-04-20 13:36:52.08991303 +0000 UTC m=+368.685604085" lastFinishedPulling="2026-04-20 13:36:54.31453543 +0000 UTC m=+370.910226487" observedRunningTime="2026-04-20 13:36:55.176884207 +0000 UTC m=+371.772575282" watchObservedRunningTime="2026-04-20 13:36:55.177491601 +0000 UTC m=+371.773182678" Apr 20 13:36:58.613475 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.613445 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx"] Apr 20 13:36:58.616830 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.616815 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.619426 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.619404 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 13:36:58.619520 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.619404 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 13:36:58.620233 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.620219 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bsw7x\"" Apr 20 13:36:58.625225 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.625201 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx"] Apr 20 13:36:58.684534 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.684498 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.684709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.684539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmg5\" (UniqueName: \"kubernetes.io/projected/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-kube-api-access-tnmg5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.684709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.684578 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.785823 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.785779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.785823 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.785827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmg5\" (UniqueName: \"kubernetes.io/projected/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-kube-api-access-tnmg5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.786069 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.785858 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.786273 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.786253 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.786366 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.786301 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.794917 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.794897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmg5\" (UniqueName: \"kubernetes.io/projected/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-kube-api-access-tnmg5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:58.926530 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:58.926453 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:36:59.045821 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:59.045797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx"] Apr 20 13:36:59.047333 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:36:59.047306 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode35bc3c3_f883_4b7e_b0ee_1f59a7d0b800.slice/crio-106c07576b385dab2823ec32b440a405b22a8b02a249c6674af71274b26c48cc WatchSource:0}: Error finding container 106c07576b385dab2823ec32b440a405b22a8b02a249c6674af71274b26c48cc: Status 404 returned error can't find the container with id 106c07576b385dab2823ec32b440a405b22a8b02a249c6674af71274b26c48cc Apr 20 13:36:59.175010 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:59.174966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" event={"ID":"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800","Type":"ContainerStarted","Data":"d6241ff58ca279dafffe5ad98363903a2055167f1a466b2bd841ce0368142633"} Apr 20 13:36:59.175010 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:36:59.174999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" event={"ID":"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800","Type":"ContainerStarted","Data":"106c07576b385dab2823ec32b440a405b22a8b02a249c6674af71274b26c48cc"} Apr 20 13:37:00.179658 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:00.179573 2573 generic.go:358] "Generic (PLEG): container finished" podID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerID="d6241ff58ca279dafffe5ad98363903a2055167f1a466b2bd841ce0368142633" exitCode=0 Apr 20 13:37:00.180029 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:37:00.179667 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" event={"ID":"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800","Type":"ContainerDied","Data":"d6241ff58ca279dafffe5ad98363903a2055167f1a466b2bd841ce0368142633"} Apr 20 13:37:01.184311 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:01.184272 2573 generic.go:358] "Generic (PLEG): container finished" podID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerID="7319e6c10e92185581f1344ff680c7aca5758b0365176edafae371323ebdafe8" exitCode=0 Apr 20 13:37:01.184709 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:01.184349 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" event={"ID":"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800","Type":"ContainerDied","Data":"7319e6c10e92185581f1344ff680c7aca5758b0365176edafae371323ebdafe8"} Apr 20 13:37:02.189407 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:02.189372 2573 generic.go:358] "Generic (PLEG): container finished" podID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerID="2142f0abe9f8093c70fdbcb5eb1addeb0b00474aea11e1de06fc799e115583e5" exitCode=0 Apr 20 13:37:02.189797 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:02.189451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" event={"ID":"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800","Type":"ContainerDied","Data":"2142f0abe9f8093c70fdbcb5eb1addeb0b00474aea11e1de06fc799e115583e5"} Apr 20 13:37:03.309730 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.309709 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" Apr 20 13:37:03.427994 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.427957 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-util\") pod \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " Apr 20 13:37:03.427994 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.428000 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmg5\" (UniqueName: \"kubernetes.io/projected/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-kube-api-access-tnmg5\") pod \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " Apr 20 13:37:03.428225 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.428046 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-bundle\") pod \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\" (UID: \"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800\") " Apr 20 13:37:03.428775 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.428727 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-bundle" (OuterVolumeSpecName: "bundle") pod "e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" (UID: "e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:37:03.430173 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.430119 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-kube-api-access-tnmg5" (OuterVolumeSpecName: "kube-api-access-tnmg5") pod "e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" (UID: "e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800"). InnerVolumeSpecName "kube-api-access-tnmg5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:37:03.433433 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.433409 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-util" (OuterVolumeSpecName: "util") pod "e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" (UID: "e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:37:03.528954 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.528873 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:03.528954 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.528901 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tnmg5\" (UniqueName: \"kubernetes.io/projected/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-kube-api-access-tnmg5\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:03.528954 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:03.528911 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:04.197440 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:04.197408 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx" event={"ID":"e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800","Type":"ContainerDied","Data":"106c07576b385dab2823ec32b440a405b22a8b02a249c6674af71274b26c48cc"}
Apr 20 13:37:04.197440 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:04.197441 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="106c07576b385dab2823ec32b440a405b22a8b02a249c6674af71274b26c48cc"
Apr 20 13:37:04.197628 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:04.197421 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58cfjx"
Apr 20 13:37:08.838892 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.838855 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"]
Apr 20 13:37:08.839580 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.839185 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerName="pull"
Apr 20 13:37:08.839580 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.839197 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerName="pull"
Apr 20 13:37:08.839580 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.839211 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerName="util"
Apr 20 13:37:08.839580 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.839217 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerName="util"
Apr 20 13:37:08.839580 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.839223 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerName="extract"
Apr 20 13:37:08.839580 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.839229 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerName="extract"
Apr 20 13:37:08.839580 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.839275 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e35bc3c3-f883-4b7e-b0ee-1f59a7d0b800" containerName="extract"
Apr 20 13:37:08.843510 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.843495 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:08.846561 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.846529 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 13:37:08.846679 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.846582 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bsw7x\""
Apr 20 13:37:08.847435 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.847419 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 13:37:08.853191 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.853167 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"]
Apr 20 13:37:08.973224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.973192 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:08.973224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.973226 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:08.973441 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:08.973255 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxj84\" (UniqueName: \"kubernetes.io/projected/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-kube-api-access-zxj84\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:09.073686 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:09.073651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:09.073686 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:09.073691 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxj84\" (UniqueName: \"kubernetes.io/projected/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-kube-api-access-zxj84\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:09.073948 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:09.073773 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:09.074113 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:09.074091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:09.074214 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:09.074099 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:09.082569 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:09.082546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxj84\" (UniqueName: \"kubernetes.io/projected/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-kube-api-access-zxj84\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:09.153481 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:09.153413 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:09.288422 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:09.288389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"]
Apr 20 13:37:09.291194 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:37:09.291165 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84c659b4_c3f8_4930_9b1d_31a6e8f2e6eb.slice/crio-1fd074080e4583d17925657960b59707e83a5e359ccf9ad270dea40bb9dc7a77 WatchSource:0}: Error finding container 1fd074080e4583d17925657960b59707e83a5e359ccf9ad270dea40bb9dc7a77: Status 404 returned error can't find the container with id 1fd074080e4583d17925657960b59707e83a5e359ccf9ad270dea40bb9dc7a77
Apr 20 13:37:10.218600 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.218568 2573 generic.go:358] "Generic (PLEG): container finished" podID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerID="c015904d351233f86ddf8873d7d86e0a1ec2856854b2657351688c655125ef2b" exitCode=0
Apr 20 13:37:10.218955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.218636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9" event={"ID":"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb","Type":"ContainerDied","Data":"c015904d351233f86ddf8873d7d86e0a1ec2856854b2657351688c655125ef2b"}
Apr 20 13:37:10.218955 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.218660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9" event={"ID":"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb","Type":"ContainerStarted","Data":"1fd074080e4583d17925657960b59707e83a5e359ccf9ad270dea40bb9dc7a77"}
Apr 20 13:37:10.292577 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.292550 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"]
Apr 20 13:37:10.296838 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.296823 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.299112 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.299088 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 13:37:10.299264 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.299249 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 13:37:10.299331 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.299320 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 13:37:10.299397 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.299378 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zmf5m\""
Apr 20 13:37:10.299455 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.299434 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 13:37:10.310036 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.310015 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"]
Apr 20 13:37:10.385849 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.385772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10cde495-20e7-4979-9d71-c3cf9f4b00f3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.385849 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.385811 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10cde495-20e7-4979-9d71-c3cf9f4b00f3-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.385849 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.385842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr76r\" (UniqueName: \"kubernetes.io/projected/10cde495-20e7-4979-9d71-c3cf9f4b00f3-kube-api-access-kr76r\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.487107 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.487078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10cde495-20e7-4979-9d71-c3cf9f4b00f3-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.487275 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.487124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr76r\" (UniqueName: \"kubernetes.io/projected/10cde495-20e7-4979-9d71-c3cf9f4b00f3-kube-api-access-kr76r\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.487275 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.487198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10cde495-20e7-4979-9d71-c3cf9f4b00f3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.489495 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.489464 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10cde495-20e7-4979-9d71-c3cf9f4b00f3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.489622 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.489504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10cde495-20e7-4979-9d71-c3cf9f4b00f3-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.498226 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.498200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr76r\" (UniqueName: \"kubernetes.io/projected/10cde495-20e7-4979-9d71-c3cf9f4b00f3-kube-api-access-kr76r\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-zqkmf\" (UID: \"10cde495-20e7-4979-9d71-c3cf9f4b00f3\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.606487 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.606451 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:10.746863 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:10.746835 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"]
Apr 20 13:37:10.749086 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:37:10.749057 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10cde495_20e7_4979_9d71_c3cf9f4b00f3.slice/crio-7dca29f231facbe3baf19fac56d3c1d5fe8994867881b1c5f64a05facad88c60 WatchSource:0}: Error finding container 7dca29f231facbe3baf19fac56d3c1d5fe8994867881b1c5f64a05facad88c60: Status 404 returned error can't find the container with id 7dca29f231facbe3baf19fac56d3c1d5fe8994867881b1c5f64a05facad88c60
Apr 20 13:37:11.223056 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:11.222957 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf" event={"ID":"10cde495-20e7-4979-9d71-c3cf9f4b00f3","Type":"ContainerStarted","Data":"7dca29f231facbe3baf19fac56d3c1d5fe8994867881b1c5f64a05facad88c60"}
Apr 20 13:37:11.224549 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:11.224529 2573 generic.go:358] "Generic (PLEG): container finished" podID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerID="b1bf8202a63074e055bd97149119a7ec29100aee39dc3f8849a253ab8d2813d6" exitCode=0
Apr 20 13:37:11.224682 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:11.224570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9" event={"ID":"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb","Type":"ContainerDied","Data":"b1bf8202a63074e055bd97149119a7ec29100aee39dc3f8849a253ab8d2813d6"}
Apr 20 13:37:12.231329 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:12.231297 2573 generic.go:358] "Generic (PLEG): container finished" podID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerID="f9ffb685fff01b07feff15a528b5b3455c113b80ccf131ee2930f7a06cd2261f" exitCode=0
Apr 20 13:37:12.231807 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:12.231390 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9" event={"ID":"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb","Type":"ContainerDied","Data":"f9ffb685fff01b07feff15a528b5b3455c113b80ccf131ee2930f7a06cd2261f"}
Apr 20 13:37:13.359926 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.359903 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:13.514227 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.514125 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-util\") pod \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") "
Apr 20 13:37:13.514227 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.514207 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-bundle\") pod \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") "
Apr 20 13:37:13.514447 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.514236 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxj84\" (UniqueName: \"kubernetes.io/projected/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-kube-api-access-zxj84\") pod \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\" (UID: \"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb\") "
Apr 20 13:37:13.514950 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.514901 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-bundle" (OuterVolumeSpecName: "bundle") pod "84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" (UID: "84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 13:37:13.516402 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.516370 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-kube-api-access-zxj84" (OuterVolumeSpecName: "kube-api-access-zxj84") pod "84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" (UID: "84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb"). InnerVolumeSpecName "kube-api-access-zxj84". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:37:13.519650 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.519627 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-util" (OuterVolumeSpecName: "util") pod "84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" (UID: "84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 13:37:13.615382 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.615345 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:37:13.615382 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.615377 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zxj84\" (UniqueName: \"kubernetes.io/projected/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-kube-api-access-zxj84\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:37:13.615382 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:13.615389 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:37:14.239813 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:14.239721 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9" event={"ID":"84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb","Type":"ContainerDied","Data":"1fd074080e4583d17925657960b59707e83a5e359ccf9ad270dea40bb9dc7a77"}
Apr 20 13:37:14.239813 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:14.239743 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mvff9"
Apr 20 13:37:14.239813 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:14.239762 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd074080e4583d17925657960b59707e83a5e359ccf9ad270dea40bb9dc7a77"
Apr 20 13:37:14.241304 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:14.241271 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf" event={"ID":"10cde495-20e7-4979-9d71-c3cf9f4b00f3","Type":"ContainerStarted","Data":"00e49fdcac19b0e65177e413b070a8bd148e6cd8cac2946836f68e51542ea595"}
Apr 20 13:37:14.241416 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:14.241385 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:14.265777 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:14.265733 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf" podStartSLOduration=1.774865915 podStartE2EDuration="4.265719703s" podCreationTimestamp="2026-04-20 13:37:10 +0000 UTC" firstStartedPulling="2026-04-20 13:37:10.750842856 +0000 UTC m=+387.346533913" lastFinishedPulling="2026-04-20 13:37:13.241696645 +0000 UTC m=+389.837387701" observedRunningTime="2026-04-20 13:37:14.263481275 +0000 UTC m=+390.859172351" watchObservedRunningTime="2026-04-20 13:37:14.265719703 +0000 UTC m=+390.861410778"
Apr 20 13:37:25.065650 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.065618 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"]
Apr 20 13:37:25.066112 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.066004 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerName="pull"
Apr 20 13:37:25.066112 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.066020 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerName="pull"
Apr 20 13:37:25.066112 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.066041 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerName="extract"
Apr 20 13:37:25.066112 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.066050 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerName="extract"
Apr 20 13:37:25.066112 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.066065 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerName="util"
Apr 20 13:37:25.066112 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.066073 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerName="util"
Apr 20 13:37:25.066439 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.066183 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="84c659b4-c3f8-4930-9b1d-31a6e8f2e6eb" containerName="extract"
Apr 20 13:37:25.073323 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.073302 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.076896 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.076865 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 13:37:25.077037 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.076895 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 13:37:25.077037 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.076946 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-qfg5c\""
Apr 20 13:37:25.077037 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.076895 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 13:37:25.078756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.078733 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"]
Apr 20 13:37:25.203888 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.203854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-manager-config\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.204074 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.203902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-metrics-cert\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.204074 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.203967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-cert\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.204074 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.204061 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmggg\" (UniqueName: \"kubernetes.io/projected/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-kube-api-access-cmggg\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.246660 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.246635 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-zqkmf"
Apr 20 13:37:25.304792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.304755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmggg\" (UniqueName: \"kubernetes.io/projected/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-kube-api-access-cmggg\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.304939 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.304801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-manager-config\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.304939 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.304838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-metrics-cert\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.304939 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.304877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-cert\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.305656 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.305626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-manager-config\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.307315 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.307293 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-cert\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.307440 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.307422 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-metrics-cert\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.315095 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.315068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmggg\" (UniqueName: \"kubernetes.io/projected/94bbdb99-f7ef-4514-82dd-4850b5c9ec5f-kube-api-access-cmggg\") pod \"lws-controller-manager-59c6b8cc85-9dkks\" (UID: \"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.383448 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.383335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:25.507521 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:25.507498 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"]
Apr 20 13:37:25.509953 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:37:25.509919 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94bbdb99_f7ef_4514_82dd_4850b5c9ec5f.slice/crio-d461ba213a770933bbb50c09ad4e6caff66a6c280ff81dca785c9ef927903e58 WatchSource:0}: Error finding container d461ba213a770933bbb50c09ad4e6caff66a6c280ff81dca785c9ef927903e58: Status 404 returned error can't find the container with id d461ba213a770933bbb50c09ad4e6caff66a6c280ff81dca785c9ef927903e58
Apr 20 13:37:26.280797 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:26.280762 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks" event={"ID":"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f","Type":"ContainerStarted","Data":"d461ba213a770933bbb50c09ad4e6caff66a6c280ff81dca785c9ef927903e58"}
Apr 20 13:37:28.288407 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:28.288376 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks" event={"ID":"94bbdb99-f7ef-4514-82dd-4850b5c9ec5f","Type":"ContainerStarted","Data":"9a18c5323fab615473f7b6aba3acc5758a6745b795ed40bfc263ded0c586b748"}
Apr 20 13:37:28.288774 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:28.288527 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:28.329654 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:28.329614 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks" podStartSLOduration=0.960727193 podStartE2EDuration="3.329600536s" podCreationTimestamp="2026-04-20 13:37:25 +0000 UTC" firstStartedPulling="2026-04-20 13:37:25.511826742 +0000 UTC m=+402.107517797" lastFinishedPulling="2026-04-20 13:37:27.880700081 +0000 UTC m=+404.476391140" observedRunningTime="2026-04-20 13:37:28.327555037 +0000 UTC m=+404.923246113" watchObservedRunningTime="2026-04-20 13:37:28.329600536 +0000 UTC m=+404.925291612"
Apr 20 13:37:39.294238 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.294206 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-9dkks"
Apr 20 13:37:39.344252 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.344225 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"]
Apr 20 13:37:39.346371 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.346349 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"
Apr 20 13:37:39.348834 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.348814 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bsw7x\""
Apr 20 13:37:39.348943 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.348897 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 13:37:39.348943 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.348919 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 13:37:39.355831 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.355809 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"]
Apr 20 13:37:39.516599 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.516565 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"
Apr 20 13:37:39.516872 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.516688 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"
Apr 20 13:37:39.516872 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.516772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf2fc\" (UniqueName: \"kubernetes.io/projected/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-kube-api-access-xf2fc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"
Apr 20 13:37:39.617341 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.617263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"
Apr 20 13:37:39.617341 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.617309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xf2fc\" (UniqueName: \"kubernetes.io/projected/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-kube-api-access-xf2fc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"
Apr 20 13:37:39.617511 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.617344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"
Apr 20 13:37:39.617636 ip-10-0-142-144
kubenswrapper[2573]: I0420 13:37:39.617616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" Apr 20 13:37:39.617674 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.617663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" Apr 20 13:37:39.628760 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.628732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf2fc\" (UniqueName: \"kubernetes.io/projected/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-kube-api-access-xf2fc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" Apr 20 13:37:39.657024 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.657004 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" Apr 20 13:37:39.782780 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:39.782757 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7"] Apr 20 13:37:39.784821 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:37:39.784798 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31bd7efa_55d1_4929_9cdd_4c77c1ae93df.slice/crio-632d2d504d8e85467ad885e9d59a1e203aeb848bb675a02b7273fbd5f377ab9e WatchSource:0}: Error finding container 632d2d504d8e85467ad885e9d59a1e203aeb848bb675a02b7273fbd5f377ab9e: Status 404 returned error can't find the container with id 632d2d504d8e85467ad885e9d59a1e203aeb848bb675a02b7273fbd5f377ab9e Apr 20 13:37:40.335100 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:40.335068 2573 generic.go:358] "Generic (PLEG): container finished" podID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerID="83797848a4f992aab155a3beedbdd100ef4d387cea723ed10ef67460cb737698" exitCode=0 Apr 20 13:37:40.335474 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:40.335168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" event={"ID":"31bd7efa-55d1-4929-9cdd-4c77c1ae93df","Type":"ContainerDied","Data":"83797848a4f992aab155a3beedbdd100ef4d387cea723ed10ef67460cb737698"} Apr 20 13:37:40.335474 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:40.335204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" event={"ID":"31bd7efa-55d1-4929-9cdd-4c77c1ae93df","Type":"ContainerStarted","Data":"632d2d504d8e85467ad885e9d59a1e203aeb848bb675a02b7273fbd5f377ab9e"} Apr 20 13:37:41.340136 ip-10-0-142-144 kubenswrapper[2573]: 
I0420 13:37:41.340103 2573 generic.go:358] "Generic (PLEG): container finished" podID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerID="dec2fe20eaa0b395a6348b0db6db1582eb24594ec5ff2fd0a38aef6224663140" exitCode=0 Apr 20 13:37:41.340597 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:41.340225 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" event={"ID":"31bd7efa-55d1-4929-9cdd-4c77c1ae93df","Type":"ContainerDied","Data":"dec2fe20eaa0b395a6348b0db6db1582eb24594ec5ff2fd0a38aef6224663140"} Apr 20 13:37:42.345287 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:42.345254 2573 generic.go:358] "Generic (PLEG): container finished" podID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerID="b223216e4dc76d380e6d3e346eb39fd5162a2812458f065926cae734797ad13b" exitCode=0 Apr 20 13:37:42.345677 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:42.345323 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" event={"ID":"31bd7efa-55d1-4929-9cdd-4c77c1ae93df","Type":"ContainerDied","Data":"b223216e4dc76d380e6d3e346eb39fd5162a2812458f065926cae734797ad13b"} Apr 20 13:37:43.482185 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.482163 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" Apr 20 13:37:43.543569 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.543540 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-bundle\") pod \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " Apr 20 13:37:43.543728 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.543587 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf2fc\" (UniqueName: \"kubernetes.io/projected/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-kube-api-access-xf2fc\") pod \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " Apr 20 13:37:43.543728 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.543621 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-util\") pod \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\" (UID: \"31bd7efa-55d1-4929-9cdd-4c77c1ae93df\") " Apr 20 13:37:43.544808 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.544769 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-bundle" (OuterVolumeSpecName: "bundle") pod "31bd7efa-55d1-4929-9cdd-4c77c1ae93df" (UID: "31bd7efa-55d1-4929-9cdd-4c77c1ae93df"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:37:43.545755 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.545726 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-kube-api-access-xf2fc" (OuterVolumeSpecName: "kube-api-access-xf2fc") pod "31bd7efa-55d1-4929-9cdd-4c77c1ae93df" (UID: "31bd7efa-55d1-4929-9cdd-4c77c1ae93df"). InnerVolumeSpecName "kube-api-access-xf2fc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:37:43.552192 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.552133 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-util" (OuterVolumeSpecName: "util") pod "31bd7efa-55d1-4929-9cdd-4c77c1ae93df" (UID: "31bd7efa-55d1-4929-9cdd-4c77c1ae93df"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:37:43.644613 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.644544 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xf2fc\" (UniqueName: \"kubernetes.io/projected/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-kube-api-access-xf2fc\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:43.644613 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.644572 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:43.644613 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:43.644581 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31bd7efa-55d1-4929-9cdd-4c77c1ae93df-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:44.353610 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:44.353579 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" event={"ID":"31bd7efa-55d1-4929-9cdd-4c77c1ae93df","Type":"ContainerDied","Data":"632d2d504d8e85467ad885e9d59a1e203aeb848bb675a02b7273fbd5f377ab9e"} Apr 20 13:37:44.353610 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:44.353615 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="632d2d504d8e85467ad885e9d59a1e203aeb848bb675a02b7273fbd5f377ab9e" Apr 20 13:37:44.353827 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:44.353587 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x2wd7" Apr 20 13:37:53.121016 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.120974 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv"] Apr 20 13:37:53.121519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.121422 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerName="util" Apr 20 13:37:53.121519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.121441 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerName="util" Apr 20 13:37:53.121519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.121455 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerName="pull" Apr 20 13:37:53.121519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.121463 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerName="pull" Apr 20 13:37:53.121519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.121478 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerName="extract" Apr 20 13:37:53.121519 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.121486 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerName="extract" Apr 20 13:37:53.121823 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.121571 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="31bd7efa-55d1-4929-9cdd-4c77c1ae93df" containerName="extract" Apr 20 13:37:53.127236 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.127211 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.130588 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.130566 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 13:37:53.130588 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.130581 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 13:37:53.131251 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.131235 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bsw7x\"" Apr 20 13:37:53.134530 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.134503 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv"] Apr 20 13:37:53.218105 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.218066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: 
\"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.218105 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.218105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.218344 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.218167 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp6jw\" (UniqueName: \"kubernetes.io/projected/b9df4f7c-6624-41c4-99f7-49d53b39e88c-kube-api-access-sp6jw\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.319603 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.319560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp6jw\" (UniqueName: \"kubernetes.io/projected/b9df4f7c-6624-41c4-99f7-49d53b39e88c-kube-api-access-sp6jw\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.319803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.319655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: 
\"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.319803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.319680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.320054 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.320034 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.320120 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.320068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.329602 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.329575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp6jw\" (UniqueName: \"kubernetes.io/projected/b9df4f7c-6624-41c4-99f7-49d53b39e88c-kube-api-access-sp6jw\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.437258 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.437171 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:53.579812 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:53.579787 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv"] Apr 20 13:37:53.581776 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:37:53.581745 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9df4f7c_6624_41c4_99f7_49d53b39e88c.slice/crio-c95950c2f01d34dae14072a9a853bd6464d124f2baafe2702f58080a4ca11f43 WatchSource:0}: Error finding container c95950c2f01d34dae14072a9a853bd6464d124f2baafe2702f58080a4ca11f43: Status 404 returned error can't find the container with id c95950c2f01d34dae14072a9a853bd6464d124f2baafe2702f58080a4ca11f43 Apr 20 13:37:54.390887 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:54.390855 2573 generic.go:358] "Generic (PLEG): container finished" podID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerID="bdf3f85113d12fb1dc4342f72a0b3b06bd47ce9b417893db8fa4572e4366f888" exitCode=0 Apr 20 13:37:54.391244 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:54.390946 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" event={"ID":"b9df4f7c-6624-41c4-99f7-49d53b39e88c","Type":"ContainerDied","Data":"bdf3f85113d12fb1dc4342f72a0b3b06bd47ce9b417893db8fa4572e4366f888"} Apr 20 13:37:54.391244 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:54.390978 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" event={"ID":"b9df4f7c-6624-41c4-99f7-49d53b39e88c","Type":"ContainerStarted","Data":"c95950c2f01d34dae14072a9a853bd6464d124f2baafe2702f58080a4ca11f43"} Apr 20 13:37:55.396044 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:55.396013 2573 generic.go:358] "Generic (PLEG): container finished" podID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerID="bc87f77abcd219d83c363aefc1a377fbf2d9b282de1a44125cd7039836c2491b" exitCode=0 Apr 20 13:37:55.396430 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:55.396050 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" event={"ID":"b9df4f7c-6624-41c4-99f7-49d53b39e88c","Type":"ContainerDied","Data":"bc87f77abcd219d83c363aefc1a377fbf2d9b282de1a44125cd7039836c2491b"} Apr 20 13:37:56.401593 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:56.401552 2573 generic.go:358] "Generic (PLEG): container finished" podID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerID="59bba4c99056b0209393a5bee617b921f0c3ad559988d553f01f3ca9bab6d300" exitCode=0 Apr 20 13:37:56.401976 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:56.401632 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" event={"ID":"b9df4f7c-6624-41c4-99f7-49d53b39e88c","Type":"ContainerDied","Data":"59bba4c99056b0209393a5bee617b921f0c3ad559988d553f01f3ca9bab6d300"} Apr 20 13:37:57.526468 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.526448 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:37:57.556268 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.556234 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp6jw\" (UniqueName: \"kubernetes.io/projected/b9df4f7c-6624-41c4-99f7-49d53b39e88c-kube-api-access-sp6jw\") pod \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " Apr 20 13:37:57.556421 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.556288 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-bundle\") pod \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " Apr 20 13:37:57.556421 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.556326 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-util\") pod \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\" (UID: \"b9df4f7c-6624-41c4-99f7-49d53b39e88c\") " Apr 20 13:37:57.557223 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.557193 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-bundle" (OuterVolumeSpecName: "bundle") pod "b9df4f7c-6624-41c4-99f7-49d53b39e88c" (UID: "b9df4f7c-6624-41c4-99f7-49d53b39e88c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:37:57.558413 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.558379 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9df4f7c-6624-41c4-99f7-49d53b39e88c-kube-api-access-sp6jw" (OuterVolumeSpecName: "kube-api-access-sp6jw") pod "b9df4f7c-6624-41c4-99f7-49d53b39e88c" (UID: "b9df4f7c-6624-41c4-99f7-49d53b39e88c"). InnerVolumeSpecName "kube-api-access-sp6jw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:37:57.561969 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.561948 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-util" (OuterVolumeSpecName: "util") pod "b9df4f7c-6624-41c4-99f7-49d53b39e88c" (UID: "b9df4f7c-6624-41c4-99f7-49d53b39e88c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:37:57.657310 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.657223 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sp6jw\" (UniqueName: \"kubernetes.io/projected/b9df4f7c-6624-41c4-99f7-49d53b39e88c-kube-api-access-sp6jw\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:57.657310 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.657261 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:57.657310 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:57.657272 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9df4f7c-6624-41c4-99f7-49d53b39e88c-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:37:58.410227 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:58.410194 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" event={"ID":"b9df4f7c-6624-41c4-99f7-49d53b39e88c","Type":"ContainerDied","Data":"c95950c2f01d34dae14072a9a853bd6464d124f2baafe2702f58080a4ca11f43"} Apr 20 13:37:58.410227 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:58.410226 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c95950c2f01d34dae14072a9a853bd6464d124f2baafe2702f58080a4ca11f43" Apr 20 13:37:58.410479 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:37:58.410258 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22t4lv" Apr 20 13:38:15.941324 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.941243 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"] Apr 20 13:38:15.941752 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.941657 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerName="util" Apr 20 13:38:15.941752 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.941673 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerName="util" Apr 20 13:38:15.941752 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.941686 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerName="pull" Apr 20 13:38:15.941752 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.941695 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerName="pull" Apr 20 13:38:15.941752 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.941710 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerName="extract"
Apr 20 13:38:15.941752 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.941719 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerName="extract"
Apr 20 13:38:15.941936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.941796 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9df4f7c-6624-41c4-99f7-49d53b39e88c" containerName="extract"
Apr 20 13:38:15.953977 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.953953 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:15.954683 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.954660 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"]
Apr 20 13:38:15.956136 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.956111 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 13:38:15.956263 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:15.956250 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-s5jg2\""
Apr 20 13:38:16.103792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.103758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwrv\" (UniqueName: \"kubernetes.io/projected/f6293098-abf3-4a59-b6d9-be0f73a7ef51-kube-api-access-tqwrv\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.103792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.103799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.104013 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.103821 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.104013 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.103900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.104013 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.103932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.104013 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.103954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.104168 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.104029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.104168 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.104060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.104168 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.104100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205303 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205222 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205303 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205303 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwrv\" (UniqueName: \"kubernetes.io/projected/f6293098-abf3-4a59-b6d9-be0f73a7ef51-kube-api-access-tqwrv\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205303 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205305 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205638 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205638 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205638 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205638 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205638 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205833 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205780 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205833 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205934 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.205983 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.205928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.206227 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.206205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.207653 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.207634 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.207793 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.207775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.212817 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.212795 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f6293098-abf3-4a59-b6d9-be0f73a7ef51-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.212927 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.212873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwrv\" (UniqueName: \"kubernetes.io/projected/f6293098-abf3-4a59-b6d9-be0f73a7ef51-kube-api-access-tqwrv\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn\" (UID: \"f6293098-abf3-4a59-b6d9-be0f73a7ef51\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.263948 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.263912 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:16.389747 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.389719 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"]
Apr 20 13:38:16.394283 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:16.392394 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6293098_abf3_4a59_b6d9_be0f73a7ef51.slice/crio-87d132af2ea5dcae15bad67aea6811ef85ac54100a9d92fe33b9ac6eb6a9bd83 WatchSource:0}: Error finding container 87d132af2ea5dcae15bad67aea6811ef85ac54100a9d92fe33b9ac6eb6a9bd83: Status 404 returned error can't find the container with id 87d132af2ea5dcae15bad67aea6811ef85ac54100a9d92fe33b9ac6eb6a9bd83
Apr 20 13:38:16.472425 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:16.472343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn" event={"ID":"f6293098-abf3-4a59-b6d9-be0f73a7ef51","Type":"ContainerStarted","Data":"87d132af2ea5dcae15bad67aea6811ef85ac54100a9d92fe33b9ac6eb6a9bd83"}
Apr 20 13:38:18.696437 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:18.696399 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 20 13:38:18.696808 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:18.696480 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 20 13:38:18.696808 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:18.696525 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 20 13:38:19.484853 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:19.484818 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn" event={"ID":"f6293098-abf3-4a59-b6d9-be0f73a7ef51","Type":"ContainerStarted","Data":"c22f3130367e1c6e877bba59b1bc8d047cbf07772bbf94935e4361734321e05c"}
Apr 20 13:38:19.505627 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:19.505574 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn" podStartSLOduration=2.205277296 podStartE2EDuration="4.505560053s" podCreationTimestamp="2026-04-20 13:38:15 +0000 UTC" firstStartedPulling="2026-04-20 13:38:16.395851329 +0000 UTC m=+452.991542383" lastFinishedPulling="2026-04-20 13:38:18.696134079 +0000 UTC m=+455.291825140" observedRunningTime="2026-04-20 13:38:19.504495778 +0000 UTC m=+456.100186851" watchObservedRunningTime="2026-04-20 13:38:19.505560053 +0000 UTC m=+456.101251129"
Apr 20 13:38:20.264763 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:20.264724 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:20.269425 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:20.269400 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:20.488407 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:20.488380 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:20.489314 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:20.489297 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn"
Apr 20 13:38:28.542367 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.542332 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hkqz6"]
Apr 20 13:38:28.547155 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.547128 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6"
Apr 20 13:38:28.550273 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.550234 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 13:38:28.550273 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.550241 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 13:38:28.550438 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.550304 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-n2wfl\""
Apr 20 13:38:28.554089 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.554066 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hkqz6"]
Apr 20 13:38:28.716320 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.716267 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg4rd\" (UniqueName: \"kubernetes.io/projected/e0d78f46-31d3-4935-a8d9-747fd114d65e-kube-api-access-jg4rd\") pod \"kuadrant-operator-catalog-hkqz6\" (UID: \"e0d78f46-31d3-4935-a8d9-747fd114d65e\") " pod="kuadrant-system/kuadrant-operator-catalog-hkqz6"
Apr 20 13:38:28.817739 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.817653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg4rd\" (UniqueName: \"kubernetes.io/projected/e0d78f46-31d3-4935-a8d9-747fd114d65e-kube-api-access-jg4rd\") pod \"kuadrant-operator-catalog-hkqz6\" (UID: \"e0d78f46-31d3-4935-a8d9-747fd114d65e\") " pod="kuadrant-system/kuadrant-operator-catalog-hkqz6"
Apr 20 13:38:28.826475 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.826446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg4rd\" (UniqueName: \"kubernetes.io/projected/e0d78f46-31d3-4935-a8d9-747fd114d65e-kube-api-access-jg4rd\") pod \"kuadrant-operator-catalog-hkqz6\" (UID: \"e0d78f46-31d3-4935-a8d9-747fd114d65e\") " pod="kuadrant-system/kuadrant-operator-catalog-hkqz6"
Apr 20 13:38:28.859219 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.859190 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6"
Apr 20 13:38:28.902213 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.902181 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hkqz6"]
Apr 20 13:38:28.984790 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:28.984744 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hkqz6"]
Apr 20 13:38:28.986976 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:28.986941 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d78f46_31d3_4935_a8d9_747fd114d65e.slice/crio-af0f3a9bdc8f2212b7defb9f21c15f1a0e0357bcfc35ea04a98132b088ca4cfc WatchSource:0}: Error finding container af0f3a9bdc8f2212b7defb9f21c15f1a0e0357bcfc35ea04a98132b088ca4cfc: Status 404 returned error can't find the container with id af0f3a9bdc8f2212b7defb9f21c15f1a0e0357bcfc35ea04a98132b088ca4cfc
Apr 20 13:38:29.108220 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.108160 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-k5wnq"]
Apr 20 13:38:29.113087 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.113072 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:29.119493 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.119467 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-k5wnq"]
Apr 20 13:38:29.220764 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.220735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5prr6\" (UniqueName: \"kubernetes.io/projected/ac6b711e-d59d-469a-b1e5-276940ec1a43-kube-api-access-5prr6\") pod \"kuadrant-operator-catalog-k5wnq\" (UID: \"ac6b711e-d59d-469a-b1e5-276940ec1a43\") " pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:29.321159 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.321113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5prr6\" (UniqueName: \"kubernetes.io/projected/ac6b711e-d59d-469a-b1e5-276940ec1a43-kube-api-access-5prr6\") pod \"kuadrant-operator-catalog-k5wnq\" (UID: \"ac6b711e-d59d-469a-b1e5-276940ec1a43\") " pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:29.331412 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.331387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5prr6\" (UniqueName: \"kubernetes.io/projected/ac6b711e-d59d-469a-b1e5-276940ec1a43-kube-api-access-5prr6\") pod \"kuadrant-operator-catalog-k5wnq\" (UID: \"ac6b711e-d59d-469a-b1e5-276940ec1a43\") " pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:29.423628 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.423561 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:29.520293 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.520253 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6" event={"ID":"e0d78f46-31d3-4935-a8d9-747fd114d65e","Type":"ContainerStarted","Data":"af0f3a9bdc8f2212b7defb9f21c15f1a0e0357bcfc35ea04a98132b088ca4cfc"}
Apr 20 13:38:29.549187 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:29.549134 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-k5wnq"]
Apr 20 13:38:29.554623 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:29.554594 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac6b711e_d59d_469a_b1e5_276940ec1a43.slice/crio-de9601824521a20cd2c0f8773031ec01131c29a18f56cd0799dca1fc695dc209 WatchSource:0}: Error finding container de9601824521a20cd2c0f8773031ec01131c29a18f56cd0799dca1fc695dc209: Status 404 returned error can't find the container with id de9601824521a20cd2c0f8773031ec01131c29a18f56cd0799dca1fc695dc209
Apr 20 13:38:30.525495 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:30.525448 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq" event={"ID":"ac6b711e-d59d-469a-b1e5-276940ec1a43","Type":"ContainerStarted","Data":"de9601824521a20cd2c0f8773031ec01131c29a18f56cd0799dca1fc695dc209"}
Apr 20 13:38:31.530779 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:31.530738 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6" event={"ID":"e0d78f46-31d3-4935-a8d9-747fd114d65e","Type":"ContainerStarted","Data":"3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787"}
Apr 20 13:38:31.531249 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:31.530800 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6" podUID="e0d78f46-31d3-4935-a8d9-747fd114d65e" containerName="registry-server" containerID="cri-o://3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787" gracePeriod=2
Apr 20 13:38:31.532155 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:31.532111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq" event={"ID":"ac6b711e-d59d-469a-b1e5-276940ec1a43","Type":"ContainerStarted","Data":"dcffd99247c607ba76726968955304384ac04bc51bc4fcca29e1acb3b15bd833"}
Apr 20 13:38:31.546374 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:31.546330 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6" podStartSLOduration=1.21032506 podStartE2EDuration="3.546316364s" podCreationTimestamp="2026-04-20 13:38:28 +0000 UTC" firstStartedPulling="2026-04-20 13:38:28.988317022 +0000 UTC m=+465.584008090" lastFinishedPulling="2026-04-20 13:38:31.324308336 +0000 UTC m=+467.919999394" observedRunningTime="2026-04-20 13:38:31.545369002 +0000 UTC m=+468.141060077" watchObservedRunningTime="2026-04-20 13:38:31.546316364 +0000 UTC m=+468.142007440"
Apr 20 13:38:31.560735 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:31.560665 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq" podStartSLOduration=0.791108088 podStartE2EDuration="2.560645198s" podCreationTimestamp="2026-04-20 13:38:29 +0000 UTC" firstStartedPulling="2026-04-20 13:38:29.5559028 +0000 UTC m=+466.151593855" lastFinishedPulling="2026-04-20 13:38:31.325439897 +0000 UTC m=+467.921130965" observedRunningTime="2026-04-20 13:38:31.560636083 +0000 UTC m=+468.156327160" watchObservedRunningTime="2026-04-20 13:38:31.560645198 +0000 UTC m=+468.156336277"
Apr 20 13:38:31.768896 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:31.768870 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6"
Apr 20 13:38:31.945411 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:31.945341 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg4rd\" (UniqueName: \"kubernetes.io/projected/e0d78f46-31d3-4935-a8d9-747fd114d65e-kube-api-access-jg4rd\") pod \"e0d78f46-31d3-4935-a8d9-747fd114d65e\" (UID: \"e0d78f46-31d3-4935-a8d9-747fd114d65e\") "
Apr 20 13:38:31.947450 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:31.947419 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d78f46-31d3-4935-a8d9-747fd114d65e-kube-api-access-jg4rd" (OuterVolumeSpecName: "kube-api-access-jg4rd") pod "e0d78f46-31d3-4935-a8d9-747fd114d65e" (UID: "e0d78f46-31d3-4935-a8d9-747fd114d65e"). InnerVolumeSpecName "kube-api-access-jg4rd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:38:32.046848 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.046819 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jg4rd\" (UniqueName: \"kubernetes.io/projected/e0d78f46-31d3-4935-a8d9-747fd114d65e-kube-api-access-jg4rd\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:38:32.536931 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.536893 2573 generic.go:358] "Generic (PLEG): container finished" podID="e0d78f46-31d3-4935-a8d9-747fd114d65e" containerID="3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787" exitCode=0
Apr 20 13:38:32.537332 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.536955 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6"
Apr 20 13:38:32.537332 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.536976 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6" event={"ID":"e0d78f46-31d3-4935-a8d9-747fd114d65e","Type":"ContainerDied","Data":"3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787"}
Apr 20 13:38:32.537332 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.537017 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hkqz6" event={"ID":"e0d78f46-31d3-4935-a8d9-747fd114d65e","Type":"ContainerDied","Data":"af0f3a9bdc8f2212b7defb9f21c15f1a0e0357bcfc35ea04a98132b088ca4cfc"}
Apr 20 13:38:32.537332 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.537033 2573 scope.go:117] "RemoveContainer" containerID="3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787"
Apr 20 13:38:32.545552 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.545534 2573 scope.go:117] "RemoveContainer" containerID="3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787"
Apr 20 13:38:32.545788 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:38:32.545766 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787\": container with ID starting with 3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787 not found: ID does not exist" containerID="3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787"
Apr 20 13:38:32.545865 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.545802 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787"} err="failed to get container status \"3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787\": rpc error: code = NotFound desc = could not find container \"3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787\": container with ID starting with 3849e2ccebaf6c7414deda64e8252bb90213afde1bcaf7e5c34556674a91d787 not found: ID does not exist"
Apr 20 13:38:32.553462 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.553437 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hkqz6"]
Apr 20 13:38:32.555068 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:32.555050 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hkqz6"]
Apr 20 13:38:33.983161 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:33.980873 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d78f46-31d3-4935-a8d9-747fd114d65e" path="/var/lib/kubelet/pods/e0d78f46-31d3-4935-a8d9-747fd114d65e/volumes"
Apr 20 13:38:39.424343 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:39.424309 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:39.424343 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:39.424348 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:39.445607 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:39.445577 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:39.581065 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:39.581040 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-k5wnq"
Apr 20 13:38:40.760765 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.760726 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"]
Apr 20 13:38:40.761129 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.761052 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0d78f46-31d3-4935-a8d9-747fd114d65e" containerName="registry-server"
Apr 20 13:38:40.761129 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.761066 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d78f46-31d3-4935-a8d9-747fd114d65e" containerName="registry-server"
Apr 20 13:38:40.761129 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.761124 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0d78f46-31d3-4935-a8d9-747fd114d65e" containerName="registry-server"
Apr 20 13:38:40.764276 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.764260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.766779 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.766761 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bbcr5\""
Apr 20 13:38:40.773179 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.773156 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"]
Apr 20 13:38:40.818767 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.818739 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.818926 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.818789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.818926 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.818889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzkc\" (UniqueName: \"kubernetes.io/projected/1ceaf24b-e567-4745-b221-00135c2089ba-kube-api-access-9qzkc\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.919209 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.919174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.919360 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.919272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qzkc\" (UniqueName: \"kubernetes.io/projected/1ceaf24b-e567-4745-b221-00135c2089ba-kube-api-access-9qzkc\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.919360 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.919320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.919533 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.919512 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.919658 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.919640 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:40.927855 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:40.927836 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzkc\" (UniqueName: \"kubernetes.io/projected/1ceaf24b-e567-4745-b221-00135c2089ba-kube-api-access-9qzkc\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"
Apr 20 13:38:41.074517 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.074491 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8" Apr 20 13:38:41.213111 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.213079 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8"] Apr 20 13:38:41.216896 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:41.216865 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ceaf24b_e567_4745_b221_00135c2089ba.slice/crio-170ac15f4f1d67b2162576f636e25a3219aad2ba02e348cbdbe791e86cdc0950 WatchSource:0}: Error finding container 170ac15f4f1d67b2162576f636e25a3219aad2ba02e348cbdbe791e86cdc0950: Status 404 returned error can't find the container with id 170ac15f4f1d67b2162576f636e25a3219aad2ba02e348cbdbe791e86cdc0950 Apr 20 13:38:41.343118 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.343046 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg"] Apr 20 13:38:41.346382 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.346364 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.353956 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.353931 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg"] Apr 20 13:38:41.422403 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.422375 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmc6g\" (UniqueName: \"kubernetes.io/projected/78ed81a2-6689-4285-ba75-0dc159bbc2de-kube-api-access-nmc6g\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.422542 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.422431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.422542 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.422462 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.523339 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.523284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nmc6g\" (UniqueName: \"kubernetes.io/projected/78ed81a2-6689-4285-ba75-0dc159bbc2de-kube-api-access-nmc6g\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.523536 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.523383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.523536 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.523419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.523770 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.523751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.523818 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.523773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.531674 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.531651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmc6g\" (UniqueName: \"kubernetes.io/projected/78ed81a2-6689-4285-ba75-0dc159bbc2de-kube-api-access-nmc6g\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.567468 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.567433 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ceaf24b-e567-4745-b221-00135c2089ba" containerID="aca5692107db3a1dcb8bc34412e895b341b13e21d702643b188ef5076c538ce3" exitCode=0 Apr 20 13:38:41.567588 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.567485 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8" event={"ID":"1ceaf24b-e567-4745-b221-00135c2089ba","Type":"ContainerDied","Data":"aca5692107db3a1dcb8bc34412e895b341b13e21d702643b188ef5076c538ce3"} Apr 20 13:38:41.567588 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.567511 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8" event={"ID":"1ceaf24b-e567-4745-b221-00135c2089ba","Type":"ContainerStarted","Data":"170ac15f4f1d67b2162576f636e25a3219aad2ba02e348cbdbe791e86cdc0950"} Apr 20 13:38:41.687379 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.687284 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:41.808324 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.808294 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg"] Apr 20 13:38:41.811556 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:41.811526 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ed81a2_6689_4285_ba75_0dc159bbc2de.slice/crio-c7ad7e43817ac23895632c197be44dca50a0b9b5b2b537f9705974958a99731e WatchSource:0}: Error finding container c7ad7e43817ac23895632c197be44dca50a0b9b5b2b537f9705974958a99731e: Status 404 returned error can't find the container with id c7ad7e43817ac23895632c197be44dca50a0b9b5b2b537f9705974958a99731e Apr 20 13:38:41.953603 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.953576 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l"] Apr 20 13:38:41.956983 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.956966 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:41.966177 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:41.966133 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l"] Apr 20 13:38:42.028615 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.028574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4phq\" (UniqueName: \"kubernetes.io/projected/adf98efc-4526-4e92-9a43-65e8dcb62815-kube-api-access-q4phq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.028773 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.028649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.028773 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.028689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.129746 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.129707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.129893 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.129754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4phq\" (UniqueName: \"kubernetes.io/projected/adf98efc-4526-4e92-9a43-65e8dcb62815-kube-api-access-q4phq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.129893 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.129806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.130124 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.130104 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.130200 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.130124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-bundle\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.137947 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.137924 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4phq\" (UniqueName: \"kubernetes.io/projected/adf98efc-4526-4e92-9a43-65e8dcb62815-kube-api-access-q4phq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.266931 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.266849 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:42.391224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.391193 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l"] Apr 20 13:38:42.396849 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:42.396820 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf98efc_4526_4e92_9a43_65e8dcb62815.slice/crio-ff0839e2aeaf025d274acc1791ce95c64e807d660183293276e1194ada9a2fe1 WatchSource:0}: Error finding container ff0839e2aeaf025d274acc1791ce95c64e807d660183293276e1194ada9a2fe1: Status 404 returned error can't find the container with id ff0839e2aeaf025d274acc1791ce95c64e807d660183293276e1194ada9a2fe1 Apr 20 13:38:42.553794 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.553766 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6"] Apr 20 13:38:42.557103 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:38:42.557087 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.565040 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.565014 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6"] Apr 20 13:38:42.572359 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.572336 2573 generic.go:358] "Generic (PLEG): container finished" podID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerID="6a879845e453323243755da906293a910d613a61555c848e61e4717cb9c64f83" exitCode=0 Apr 20 13:38:42.572455 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.572396 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" event={"ID":"adf98efc-4526-4e92-9a43-65e8dcb62815","Type":"ContainerDied","Data":"6a879845e453323243755da906293a910d613a61555c848e61e4717cb9c64f83"} Apr 20 13:38:42.572455 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.572414 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" event={"ID":"adf98efc-4526-4e92-9a43-65e8dcb62815","Type":"ContainerStarted","Data":"ff0839e2aeaf025d274acc1791ce95c64e807d660183293276e1194ada9a2fe1"} Apr 20 13:38:42.573848 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.573828 2573 generic.go:358] "Generic (PLEG): container finished" podID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerID="0966b70da8ec167fdb6b869ae7bbb6d868233db357b66f9d741dc4419896d67c" exitCode=0 Apr 20 13:38:42.573931 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.573896 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" 
event={"ID":"78ed81a2-6689-4285-ba75-0dc159bbc2de","Type":"ContainerDied","Data":"0966b70da8ec167fdb6b869ae7bbb6d868233db357b66f9d741dc4419896d67c"} Apr 20 13:38:42.573931 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.573922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" event={"ID":"78ed81a2-6689-4285-ba75-0dc159bbc2de","Type":"ContainerStarted","Data":"c7ad7e43817ac23895632c197be44dca50a0b9b5b2b537f9705974958a99731e"} Apr 20 13:38:42.576110 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.575932 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ceaf24b-e567-4745-b221-00135c2089ba" containerID="b02d104f5353ea8c7acd931fff862c3d64bd1543cc918ebac1c194408755b148" exitCode=0 Apr 20 13:38:42.576110 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.575964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8" event={"ID":"1ceaf24b-e567-4745-b221-00135c2089ba","Type":"ContainerDied","Data":"b02d104f5353ea8c7acd931fff862c3d64bd1543cc918ebac1c194408755b148"} Apr 20 13:38:42.634294 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.634269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.634383 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.634316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-util\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.634383 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.634362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfjb4\" (UniqueName: \"kubernetes.io/projected/657232aa-0963-4c20-8a39-ec48e4daf9c3-kube-api-access-hfjb4\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.735097 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.735069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.735237 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.735112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.735237 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.735188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfjb4\" (UniqueName: \"kubernetes.io/projected/657232aa-0963-4c20-8a39-ec48e4daf9c3-kube-api-access-hfjb4\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.735428 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.735410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.735487 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.735466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.742559 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.742535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfjb4\" (UniqueName: \"kubernetes.io/projected/657232aa-0963-4c20-8a39-ec48e4daf9c3-kube-api-access-hfjb4\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:42.924287 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:42.924252 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:43.046080 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.046056 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6"] Apr 20 13:38:43.048498 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:43.048467 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod657232aa_0963_4c20_8a39_ec48e4daf9c3.slice/crio-23e0a24b8f55555014d5c3ef02467bcb1c0d46fb37470d14d9f8c1b3ea698667 WatchSource:0}: Error finding container 23e0a24b8f55555014d5c3ef02467bcb1c0d46fb37470d14d9f8c1b3ea698667: Status 404 returned error can't find the container with id 23e0a24b8f55555014d5c3ef02467bcb1c0d46fb37470d14d9f8c1b3ea698667 Apr 20 13:38:43.581338 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.581246 2573 generic.go:358] "Generic (PLEG): container finished" podID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerID="abee5bdb86bc543725f8f782780f723587bc2b046921a5c69bae5c3efc8a7755" exitCode=0 Apr 20 13:38:43.581338 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.581323 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" event={"ID":"657232aa-0963-4c20-8a39-ec48e4daf9c3","Type":"ContainerDied","Data":"abee5bdb86bc543725f8f782780f723587bc2b046921a5c69bae5c3efc8a7755"} Apr 20 13:38:43.581570 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.581358 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" event={"ID":"657232aa-0963-4c20-8a39-ec48e4daf9c3","Type":"ContainerStarted","Data":"23e0a24b8f55555014d5c3ef02467bcb1c0d46fb37470d14d9f8c1b3ea698667"} Apr 20 13:38:43.583337 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.583313 
2573 generic.go:358] "Generic (PLEG): container finished" podID="1ceaf24b-e567-4745-b221-00135c2089ba" containerID="1e731160c0abc1543afd668dbe1b82eb16d50e06de60b09e6f27d18d9ff2947e" exitCode=0 Apr 20 13:38:43.583456 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.583390 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8" event={"ID":"1ceaf24b-e567-4745-b221-00135c2089ba","Type":"ContainerDied","Data":"1e731160c0abc1543afd668dbe1b82eb16d50e06de60b09e6f27d18d9ff2947e"} Apr 20 13:38:43.585008 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.584987 2573 generic.go:358] "Generic (PLEG): container finished" podID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerID="b84a63e9191ded1514c2f29b53656549ae1776742c65f14ce101834206e5e316" exitCode=0 Apr 20 13:38:43.585093 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.585052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" event={"ID":"adf98efc-4526-4e92-9a43-65e8dcb62815","Type":"ContainerDied","Data":"b84a63e9191ded1514c2f29b53656549ae1776742c65f14ce101834206e5e316"} Apr 20 13:38:43.586668 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.586633 2573 generic.go:358] "Generic (PLEG): container finished" podID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerID="064e1733c4c973e08776421420e00adba065ff6124fb259760f2e174f97f6e3d" exitCode=0 Apr 20 13:38:43.586728 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:43.586676 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" event={"ID":"78ed81a2-6689-4285-ba75-0dc159bbc2de","Type":"ContainerDied","Data":"064e1733c4c973e08776421420e00adba065ff6124fb259760f2e174f97f6e3d"} Apr 20 13:38:44.592267 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.592232 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerID="64c75f6623f1b5cc6c14b4568bcf162f08dc0b1d401ed6abafc78dbe8ce2d256" exitCode=0 Apr 20 13:38:44.592706 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.592305 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" event={"ID":"adf98efc-4526-4e92-9a43-65e8dcb62815","Type":"ContainerDied","Data":"64c75f6623f1b5cc6c14b4568bcf162f08dc0b1d401ed6abafc78dbe8ce2d256"} Apr 20 13:38:44.594099 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.594073 2573 generic.go:358] "Generic (PLEG): container finished" podID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerID="3dda43d902cfaced2bdf49b1ba4118a048de3132d1c1b613173499b4bcd43e28" exitCode=0 Apr 20 13:38:44.594257 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.594160 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" event={"ID":"78ed81a2-6689-4285-ba75-0dc159bbc2de","Type":"ContainerDied","Data":"3dda43d902cfaced2bdf49b1ba4118a048de3132d1c1b613173499b4bcd43e28"} Apr 20 13:38:44.595531 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.595511 2573 generic.go:358] "Generic (PLEG): container finished" podID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerID="5a1ef5fc3d7b6edafbe8cf707b5a3f4dac4cebd4bce7e7098eef9b38572b41e9" exitCode=0 Apr 20 13:38:44.595615 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.595595 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" event={"ID":"657232aa-0963-4c20-8a39-ec48e4daf9c3","Type":"ContainerDied","Data":"5a1ef5fc3d7b6edafbe8cf707b5a3f4dac4cebd4bce7e7098eef9b38572b41e9"} Apr 20 13:38:44.722214 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.722189 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8" Apr 20 13:38:44.750788 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.750759 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-util\") pod \"1ceaf24b-e567-4745-b221-00135c2089ba\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " Apr 20 13:38:44.750906 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.750828 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-bundle\") pod \"1ceaf24b-e567-4745-b221-00135c2089ba\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " Apr 20 13:38:44.750951 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.750908 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qzkc\" (UniqueName: \"kubernetes.io/projected/1ceaf24b-e567-4745-b221-00135c2089ba-kube-api-access-9qzkc\") pod \"1ceaf24b-e567-4745-b221-00135c2089ba\" (UID: \"1ceaf24b-e567-4745-b221-00135c2089ba\") " Apr 20 13:38:44.751398 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.751358 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-bundle" (OuterVolumeSpecName: "bundle") pod "1ceaf24b-e567-4745-b221-00135c2089ba" (UID: "1ceaf24b-e567-4745-b221-00135c2089ba"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:38:44.752846 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.752823 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ceaf24b-e567-4745-b221-00135c2089ba-kube-api-access-9qzkc" (OuterVolumeSpecName: "kube-api-access-9qzkc") pod "1ceaf24b-e567-4745-b221-00135c2089ba" (UID: "1ceaf24b-e567-4745-b221-00135c2089ba"). InnerVolumeSpecName "kube-api-access-9qzkc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:38:44.755767 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.755739 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-util" (OuterVolumeSpecName: "util") pod "1ceaf24b-e567-4745-b221-00135c2089ba" (UID: "1ceaf24b-e567-4745-b221-00135c2089ba"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:38:44.851565 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.851522 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:44.851565 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.851557 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ceaf24b-e567-4745-b221-00135c2089ba-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:44.851565 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:44.851570 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9qzkc\" (UniqueName: \"kubernetes.io/projected/1ceaf24b-e567-4745-b221-00135c2089ba-kube-api-access-9qzkc\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:45.432819 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.432786 2573 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-console/console-d4d8b4658-mbq2h"] Apr 20 13:38:45.433105 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.433091 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ceaf24b-e567-4745-b221-00135c2089ba" containerName="extract" Apr 20 13:38:45.433105 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.433105 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ceaf24b-e567-4745-b221-00135c2089ba" containerName="extract" Apr 20 13:38:45.433277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.433119 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ceaf24b-e567-4745-b221-00135c2089ba" containerName="util" Apr 20 13:38:45.433277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.433124 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ceaf24b-e567-4745-b221-00135c2089ba" containerName="util" Apr 20 13:38:45.433277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.433153 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ceaf24b-e567-4745-b221-00135c2089ba" containerName="pull" Apr 20 13:38:45.433277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.433162 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ceaf24b-e567-4745-b221-00135c2089ba" containerName="pull" Apr 20 13:38:45.433277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.433246 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ceaf24b-e567-4745-b221-00135c2089ba" containerName="extract" Apr 20 13:38:45.440458 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.440430 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.448447 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.448415 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4d8b4658-mbq2h"] Apr 20 13:38:45.455020 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.454977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c32770-7d05-4a4f-83e9-7137693639ad-console-serving-cert\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.455134 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.455035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl74q\" (UniqueName: \"kubernetes.io/projected/f2c32770-7d05-4a4f-83e9-7137693639ad-kube-api-access-zl74q\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.455134 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.455083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-service-ca\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.455134 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.455127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-oauth-serving-cert\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 
13:38:45.455269 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.455199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-console-config\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.455269 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.455244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-trusted-ca-bundle\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.455351 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.455290 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2c32770-7d05-4a4f-83e9-7137693639ad-console-oauth-config\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.555739 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.555705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-console-config\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.555739 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.555748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-trusted-ca-bundle\") pod 
\"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.555954 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.555779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2c32770-7d05-4a4f-83e9-7137693639ad-console-oauth-config\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.555954 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.555823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c32770-7d05-4a4f-83e9-7137693639ad-console-serving-cert\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.555954 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.555852 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zl74q\" (UniqueName: \"kubernetes.io/projected/f2c32770-7d05-4a4f-83e9-7137693639ad-kube-api-access-zl74q\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.556348 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.556259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-service-ca\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.556923 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.556768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-oauth-serving-cert\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.556923 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.556832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-console-config\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.557102 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.557012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-service-ca\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.557102 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.557078 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-trusted-ca-bundle\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.563165 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.557663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2c32770-7d05-4a4f-83e9-7137693639ad-oauth-serving-cert\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.563165 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.559492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/f2c32770-7d05-4a4f-83e9-7137693639ad-console-oauth-config\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.564095 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.564077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c32770-7d05-4a4f-83e9-7137693639ad-console-serving-cert\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.571415 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.571391 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl74q\" (UniqueName: \"kubernetes.io/projected/f2c32770-7d05-4a4f-83e9-7137693639ad-kube-api-access-zl74q\") pod \"console-d4d8b4658-mbq2h\" (UID: \"f2c32770-7d05-4a4f-83e9-7137693639ad\") " pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.601098 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.601063 2573 generic.go:358] "Generic (PLEG): container finished" podID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerID="bb19839bc06eaa4a874f6cc4004fbb92d64062ca4b8f0b73f556d65d6ff1d7bd" exitCode=0 Apr 20 13:38:45.601500 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.601154 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" event={"ID":"657232aa-0963-4c20-8a39-ec48e4daf9c3","Type":"ContainerDied","Data":"bb19839bc06eaa4a874f6cc4004fbb92d64062ca4b8f0b73f556d65d6ff1d7bd"} Apr 20 13:38:45.602698 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.602684 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8" Apr 20 13:38:45.602749 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.602693 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8" event={"ID":"1ceaf24b-e567-4745-b221-00135c2089ba","Type":"ContainerDied","Data":"170ac15f4f1d67b2162576f636e25a3219aad2ba02e348cbdbe791e86cdc0950"} Apr 20 13:38:45.602749 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.602728 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="170ac15f4f1d67b2162576f636e25a3219aad2ba02e348cbdbe791e86cdc0950" Apr 20 13:38:45.725009 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.724987 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:45.735766 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.735749 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:45.752077 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.752056 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:45.757916 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.757889 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-util\") pod \"adf98efc-4526-4e92-9a43-65e8dcb62815\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " Apr 20 13:38:45.758032 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.757941 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-util\") pod \"78ed81a2-6689-4285-ba75-0dc159bbc2de\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " Apr 20 13:38:45.758032 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.757983 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmc6g\" (UniqueName: \"kubernetes.io/projected/78ed81a2-6689-4285-ba75-0dc159bbc2de-kube-api-access-nmc6g\") pod \"78ed81a2-6689-4285-ba75-0dc159bbc2de\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " Apr 20 13:38:45.758032 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.758002 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-bundle\") pod \"78ed81a2-6689-4285-ba75-0dc159bbc2de\" (UID: \"78ed81a2-6689-4285-ba75-0dc159bbc2de\") " Apr 20 13:38:45.758032 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.758029 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4phq\" (UniqueName: \"kubernetes.io/projected/adf98efc-4526-4e92-9a43-65e8dcb62815-kube-api-access-q4phq\") pod \"adf98efc-4526-4e92-9a43-65e8dcb62815\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " Apr 20 13:38:45.758346 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.758171 2573 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-bundle\") pod \"adf98efc-4526-4e92-9a43-65e8dcb62815\" (UID: \"adf98efc-4526-4e92-9a43-65e8dcb62815\") " Apr 20 13:38:45.759928 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.759391 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-bundle" (OuterVolumeSpecName: "bundle") pod "adf98efc-4526-4e92-9a43-65e8dcb62815" (UID: "adf98efc-4526-4e92-9a43-65e8dcb62815"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:38:45.761291 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.761085 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-bundle" (OuterVolumeSpecName: "bundle") pod "78ed81a2-6689-4285-ba75-0dc159bbc2de" (UID: "78ed81a2-6689-4285-ba75-0dc159bbc2de"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:38:45.764714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.764657 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf98efc-4526-4e92-9a43-65e8dcb62815-kube-api-access-q4phq" (OuterVolumeSpecName: "kube-api-access-q4phq") pod "adf98efc-4526-4e92-9a43-65e8dcb62815" (UID: "adf98efc-4526-4e92-9a43-65e8dcb62815"). InnerVolumeSpecName "kube-api-access-q4phq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:38:45.765047 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.765023 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ed81a2-6689-4285-ba75-0dc159bbc2de-kube-api-access-nmc6g" (OuterVolumeSpecName: "kube-api-access-nmc6g") pod "78ed81a2-6689-4285-ba75-0dc159bbc2de" (UID: "78ed81a2-6689-4285-ba75-0dc159bbc2de"). InnerVolumeSpecName "kube-api-access-nmc6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:38:45.765540 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.765521 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-util" (OuterVolumeSpecName: "util") pod "adf98efc-4526-4e92-9a43-65e8dcb62815" (UID: "adf98efc-4526-4e92-9a43-65e8dcb62815"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:38:45.766634 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.766592 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-util" (OuterVolumeSpecName: "util") pod "78ed81a2-6689-4285-ba75-0dc159bbc2de" (UID: "78ed81a2-6689-4285-ba75-0dc159bbc2de"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:38:45.859874 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.859848 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmc6g\" (UniqueName: \"kubernetes.io/projected/78ed81a2-6689-4285-ba75-0dc159bbc2de-kube-api-access-nmc6g\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:45.859874 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.859873 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:45.860047 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.859885 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4phq\" (UniqueName: \"kubernetes.io/projected/adf98efc-4526-4e92-9a43-65e8dcb62815-kube-api-access-q4phq\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:45.860047 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.859895 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:45.860047 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.859905 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf98efc-4526-4e92-9a43-65e8dcb62815-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:45.860047 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.859916 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ed81a2-6689-4285-ba75-0dc159bbc2de-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:45.877599 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:45.877572 2573 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-console/console-d4d8b4658-mbq2h"] Apr 20 13:38:45.878896 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:45.878875 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c32770_7d05_4a4f_83e9_7137693639ad.slice/crio-b49f970cab6f17007fefe2833f56a62b607f1f82b16bd3da7690329c7c28e80b WatchSource:0}: Error finding container b49f970cab6f17007fefe2833f56a62b607f1f82b16bd3da7690329c7c28e80b: Status 404 returned error can't find the container with id b49f970cab6f17007fefe2833f56a62b607f1f82b16bd3da7690329c7c28e80b Apr 20 13:38:46.607942 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.607904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4d8b4658-mbq2h" event={"ID":"f2c32770-7d05-4a4f-83e9-7137693639ad","Type":"ContainerStarted","Data":"91f59c2af708a8873cdf2187e270bfb04a9398feb41fb7efc06f7c4974ae6d25"} Apr 20 13:38:46.608425 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.607954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4d8b4658-mbq2h" event={"ID":"f2c32770-7d05-4a4f-83e9-7137693639ad","Type":"ContainerStarted","Data":"b49f970cab6f17007fefe2833f56a62b607f1f82b16bd3da7690329c7c28e80b"} Apr 20 13:38:46.609673 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.609648 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" event={"ID":"adf98efc-4526-4e92-9a43-65e8dcb62815","Type":"ContainerDied","Data":"ff0839e2aeaf025d274acc1791ce95c64e807d660183293276e1194ada9a2fe1"} Apr 20 13:38:46.609673 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.609669 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l" Apr 20 13:38:46.609859 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.609676 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff0839e2aeaf025d274acc1791ce95c64e807d660183293276e1194ada9a2fe1" Apr 20 13:38:46.611460 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.611445 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" Apr 20 13:38:46.611595 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.611477 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg" event={"ID":"78ed81a2-6689-4285-ba75-0dc159bbc2de","Type":"ContainerDied","Data":"c7ad7e43817ac23895632c197be44dca50a0b9b5b2b537f9705974958a99731e"} Apr 20 13:38:46.611595 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.611503 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7ad7e43817ac23895632c197be44dca50a0b9b5b2b537f9705974958a99731e" Apr 20 13:38:46.625677 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.625626 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d4d8b4658-mbq2h" podStartSLOduration=1.6256084720000001 podStartE2EDuration="1.625608472s" podCreationTimestamp="2026-04-20 13:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:38:46.625285905 +0000 UTC m=+483.220976982" watchObservedRunningTime="2026-04-20 13:38:46.625608472 +0000 UTC m=+483.221299579" Apr 20 13:38:46.738736 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.738709 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:46.767277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.766890 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-util\") pod \"657232aa-0963-4c20-8a39-ec48e4daf9c3\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " Apr 20 13:38:46.767277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.766998 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-bundle\") pod \"657232aa-0963-4c20-8a39-ec48e4daf9c3\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " Apr 20 13:38:46.767277 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.767029 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfjb4\" (UniqueName: \"kubernetes.io/projected/657232aa-0963-4c20-8a39-ec48e4daf9c3-kube-api-access-hfjb4\") pod \"657232aa-0963-4c20-8a39-ec48e4daf9c3\" (UID: \"657232aa-0963-4c20-8a39-ec48e4daf9c3\") " Apr 20 13:38:46.767522 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.767498 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-bundle" (OuterVolumeSpecName: "bundle") pod "657232aa-0963-4c20-8a39-ec48e4daf9c3" (UID: "657232aa-0963-4c20-8a39-ec48e4daf9c3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:38:46.769733 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.769710 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657232aa-0963-4c20-8a39-ec48e4daf9c3-kube-api-access-hfjb4" (OuterVolumeSpecName: "kube-api-access-hfjb4") pod "657232aa-0963-4c20-8a39-ec48e4daf9c3" (UID: "657232aa-0963-4c20-8a39-ec48e4daf9c3"). InnerVolumeSpecName "kube-api-access-hfjb4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:38:46.774657 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.774630 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-util" (OuterVolumeSpecName: "util") pod "657232aa-0963-4c20-8a39-ec48e4daf9c3" (UID: "657232aa-0963-4c20-8a39-ec48e4daf9c3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:38:46.867882 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.867803 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-util\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:46.867882 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.867829 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/657232aa-0963-4c20-8a39-ec48e4daf9c3-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:46.867882 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:46.867839 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hfjb4\" (UniqueName: \"kubernetes.io/projected/657232aa-0963-4c20-8a39-ec48e4daf9c3-kube-api-access-hfjb4\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:38:47.617102 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:47.617071 2573 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" Apr 20 13:38:47.617502 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:47.617074 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6" event={"ID":"657232aa-0963-4c20-8a39-ec48e4daf9c3","Type":"ContainerDied","Data":"23e0a24b8f55555014d5c3ef02467bcb1c0d46fb37470d14d9f8c1b3ea698667"} Apr 20 13:38:47.617502 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:47.617186 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e0a24b8f55555014d5c3ef02467bcb1c0d46fb37470d14d9f8c1b3ea698667" Apr 20 13:38:51.527227 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527192 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5"] Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527481 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerName="util" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527493 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerName="util" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527503 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerName="extract" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527508 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerName="extract" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527517 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerName="extract" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527522 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerName="extract" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527534 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerName="util" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527539 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerName="util" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527546 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerName="pull" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527552 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerName="pull" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527558 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerName="pull" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527563 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerName="pull" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527569 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerName="pull" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527575 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerName="pull" Apr 20 
13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527590 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerName="extract" Apr 20 13:38:51.527611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527608 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerName="extract" Apr 20 13:38:51.528075 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527626 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerName="util" Apr 20 13:38:51.528075 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527633 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerName="util" Apr 20 13:38:51.528075 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527700 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="78ed81a2-6689-4285-ba75-0dc159bbc2de" containerName="extract" Apr 20 13:38:51.528075 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527711 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="adf98efc-4526-4e92-9a43-65e8dcb62815" containerName="extract" Apr 20 13:38:51.528075 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.527718 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="657232aa-0963-4c20-8a39-ec48e4daf9c3" containerName="extract" Apr 20 13:38:51.535874 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.535848 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" Apr 20 13:38:51.538936 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.538911 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 13:38:51.539091 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.539066 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-gs8kh\"" Apr 20 13:38:51.554223 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.554202 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5"] Apr 20 13:38:51.606618 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.606577 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgtk\" (UniqueName: \"kubernetes.io/projected/90a9993b-28b8-4230-af21-0623d9670090-kube-api-access-6cgtk\") pod \"dns-operator-controller-manager-648d5c98bc-blmp5\" (UID: \"90a9993b-28b8-4230-af21-0623d9670090\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" Apr 20 13:38:51.707536 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.707495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cgtk\" (UniqueName: \"kubernetes.io/projected/90a9993b-28b8-4230-af21-0623d9670090-kube-api-access-6cgtk\") pod \"dns-operator-controller-manager-648d5c98bc-blmp5\" (UID: \"90a9993b-28b8-4230-af21-0623d9670090\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" Apr 20 13:38:51.719100 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.719070 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgtk\" (UniqueName: \"kubernetes.io/projected/90a9993b-28b8-4230-af21-0623d9670090-kube-api-access-6cgtk\") pod 
\"dns-operator-controller-manager-648d5c98bc-blmp5\" (UID: \"90a9993b-28b8-4230-af21-0623d9670090\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" Apr 20 13:38:51.845868 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.845832 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" Apr 20 13:38:51.971743 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:51.971715 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5"] Apr 20 13:38:51.973727 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:51.973702 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90a9993b_28b8_4230_af21_0623d9670090.slice/crio-f0cf8b3f77a6118b4cfd826f31c277dc133fb38add254d33be80a31c3ae19b69 WatchSource:0}: Error finding container f0cf8b3f77a6118b4cfd826f31c277dc133fb38add254d33be80a31c3ae19b69: Status 404 returned error can't find the container with id f0cf8b3f77a6118b4cfd826f31c277dc133fb38add254d33be80a31c3ae19b69 Apr 20 13:38:52.636670 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:52.636623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" event={"ID":"90a9993b-28b8-4230-af21-0623d9670090","Type":"ContainerStarted","Data":"f0cf8b3f77a6118b4cfd826f31c277dc133fb38add254d33be80a31c3ae19b69"} Apr 20 13:38:54.647656 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:54.647592 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" event={"ID":"90a9993b-28b8-4230-af21-0623d9670090","Type":"ContainerStarted","Data":"fad9b96dd0dfa4f919288064607435d9e8b10af14a4efbefa8e5b1f3923980ae"} Apr 20 13:38:54.648041 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:54.647728 2573 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" Apr 20 13:38:54.671978 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:54.671919 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" podStartSLOduration=1.221306128 podStartE2EDuration="3.671904563s" podCreationTimestamp="2026-04-20 13:38:51 +0000 UTC" firstStartedPulling="2026-04-20 13:38:51.975843054 +0000 UTC m=+488.571534113" lastFinishedPulling="2026-04-20 13:38:54.426441482 +0000 UTC m=+491.022132548" observedRunningTime="2026-04-20 13:38:54.6690469 +0000 UTC m=+491.264737977" watchObservedRunningTime="2026-04-20 13:38:54.671904563 +0000 UTC m=+491.267595697" Apr 20 13:38:55.124654 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.124615 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8"] Apr 20 13:38:55.127943 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.127927 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" Apr 20 13:38:55.131296 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.131273 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-454xx\"" Apr 20 13:38:55.137498 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.137479 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twk58\" (UniqueName: \"kubernetes.io/projected/36680ac1-beab-40cb-a5aa-6a4d594bec24-kube-api-access-twk58\") pod \"limitador-operator-controller-manager-85c4996f8c-42lf8\" (UID: \"36680ac1-beab-40cb-a5aa-6a4d594bec24\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" Apr 20 13:38:55.140833 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.140804 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8"] Apr 20 13:38:55.238154 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.238113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twk58\" (UniqueName: \"kubernetes.io/projected/36680ac1-beab-40cb-a5aa-6a4d594bec24-kube-api-access-twk58\") pod \"limitador-operator-controller-manager-85c4996f8c-42lf8\" (UID: \"36680ac1-beab-40cb-a5aa-6a4d594bec24\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" Apr 20 13:38:55.248972 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.248942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twk58\" (UniqueName: \"kubernetes.io/projected/36680ac1-beab-40cb-a5aa-6a4d594bec24-kube-api-access-twk58\") pod \"limitador-operator-controller-manager-85c4996f8c-42lf8\" (UID: \"36680ac1-beab-40cb-a5aa-6a4d594bec24\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" Apr 20 13:38:55.437997 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.437905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" Apr 20 13:38:55.569479 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.569454 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8"] Apr 20 13:38:55.571096 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:38:55.571067 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36680ac1_beab_40cb_a5aa_6a4d594bec24.slice/crio-634caeca85860d19f05b7e90aa7dd96042d8897f369a0afb892bce26edc08da1 WatchSource:0}: Error finding container 634caeca85860d19f05b7e90aa7dd96042d8897f369a0afb892bce26edc08da1: Status 404 returned error can't find the container with id 634caeca85860d19f05b7e90aa7dd96042d8897f369a0afb892bce26edc08da1 Apr 20 13:38:55.653007 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.652974 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" event={"ID":"36680ac1-beab-40cb-a5aa-6a4d594bec24","Type":"ContainerStarted","Data":"634caeca85860d19f05b7e90aa7dd96042d8897f369a0afb892bce26edc08da1"} Apr 20 13:38:55.752666 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.752581 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:55.752666 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.752626 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:55.757320 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:55.757297 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:56.660293 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:56.660263 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d4d8b4658-mbq2h" Apr 20 13:38:56.722760 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:56.722728 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59bd67467f-6rb9f"] Apr 20 13:38:57.662104 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:57.662016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" event={"ID":"36680ac1-beab-40cb-a5aa-6a4d594bec24","Type":"ContainerStarted","Data":"2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19"} Apr 20 13:38:57.662530 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:57.662234 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" Apr 20 13:38:57.681209 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:38:57.681160 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" podStartSLOduration=0.881064144 podStartE2EDuration="2.681133131s" podCreationTimestamp="2026-04-20 13:38:55 +0000 UTC" firstStartedPulling="2026-04-20 13:38:55.573037107 +0000 UTC m=+492.168728161" lastFinishedPulling="2026-04-20 13:38:57.373106083 +0000 UTC m=+493.968797148" observedRunningTime="2026-04-20 13:38:57.679830267 +0000 UTC m=+494.275521340" watchObservedRunningTime="2026-04-20 13:38:57.681133131 +0000 UTC m=+494.276824206" Apr 20 13:39:03.937990 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:03.937951 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp"] Apr 20 13:39:03.941327 ip-10-0-142-144 
kubenswrapper[2573]: I0420 13:39:03.941307 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:03.944231 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:03.944201 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ckqz8\"" Apr 20 13:39:03.945544 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:03.945519 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp"] Apr 20 13:39:04.010614 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.010578 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" (UID: \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:04.010768 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.010623 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr88r\" (UniqueName: \"kubernetes.io/projected/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-kube-api-access-rr88r\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" (UID: \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:04.111678 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.111642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" (UID: \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:04.111678 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.111684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr88r\" (UniqueName: \"kubernetes.io/projected/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-kube-api-access-rr88r\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" (UID: \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:04.112018 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.111997 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" (UID: \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:04.123254 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.123222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr88r\" (UniqueName: \"kubernetes.io/projected/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-kube-api-access-rr88r\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" (UID: \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:04.252484 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.252391 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:04.379945 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.379925 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp"] Apr 20 13:39:04.382325 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:39:04.382290 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf65846c_5a0d_4a44_bd90_b55bcfa447ff.slice/crio-7205a58623a9fa36ce53cc587341bed5ad5acdc0e603448af5cdd4ae959f7837 WatchSource:0}: Error finding container 7205a58623a9fa36ce53cc587341bed5ad5acdc0e603448af5cdd4ae959f7837: Status 404 returned error can't find the container with id 7205a58623a9fa36ce53cc587341bed5ad5acdc0e603448af5cdd4ae959f7837 Apr 20 13:39:04.686809 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:04.686775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" event={"ID":"cf65846c-5a0d-4a44-bd90-b55bcfa447ff","Type":"ContainerStarted","Data":"7205a58623a9fa36ce53cc587341bed5ad5acdc0e603448af5cdd4ae959f7837"} Apr 20 13:39:05.655794 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:05.655759 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-blmp5" Apr 20 13:39:08.668791 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:08.668754 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" Apr 20 13:39:09.708624 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:09.708536 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" 
event={"ID":"cf65846c-5a0d-4a44-bd90-b55bcfa447ff","Type":"ContainerStarted","Data":"db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672"} Apr 20 13:39:09.708624 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:09.708597 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:09.737153 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:09.737095 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" podStartSLOduration=1.750546938 podStartE2EDuration="6.737080005s" podCreationTimestamp="2026-04-20 13:39:03 +0000 UTC" firstStartedPulling="2026-04-20 13:39:04.384744941 +0000 UTC m=+500.980435996" lastFinishedPulling="2026-04-20 13:39:09.37127801 +0000 UTC m=+505.966969063" observedRunningTime="2026-04-20 13:39:09.73473178 +0000 UTC m=+506.330422880" watchObservedRunningTime="2026-04-20 13:39:09.737080005 +0000 UTC m=+506.332771123" Apr 20 13:39:20.713921 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:20.713892 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" Apr 20 13:39:21.744731 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:21.744692 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59bd67467f-6rb9f" podUID="a40190b1-f8f5-4c0a-9267-a3f911eae204" containerName="console" containerID="cri-o://233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d" gracePeriod=15 Apr 20 13:39:21.992853 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:21.992831 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59bd67467f-6rb9f_a40190b1-f8f5-4c0a-9267-a3f911eae204/console/0.log" Apr 20 13:39:21.992961 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:21.992892 2573 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59bd67467f-6rb9f" Apr 20 13:39:22.069695 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.069665 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-oauth-config\") pod \"a40190b1-f8f5-4c0a-9267-a3f911eae204\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " Apr 20 13:39:22.069864 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.069704 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-serving-cert\") pod \"a40190b1-f8f5-4c0a-9267-a3f911eae204\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " Apr 20 13:39:22.069864 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.069739 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-oauth-serving-cert\") pod \"a40190b1-f8f5-4c0a-9267-a3f911eae204\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " Apr 20 13:39:22.069864 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.069787 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-service-ca\") pod \"a40190b1-f8f5-4c0a-9267-a3f911eae204\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " Apr 20 13:39:22.069864 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.069820 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqbkt\" (UniqueName: \"kubernetes.io/projected/a40190b1-f8f5-4c0a-9267-a3f911eae204-kube-api-access-nqbkt\") pod \"a40190b1-f8f5-4c0a-9267-a3f911eae204\" (UID: 
\"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " Apr 20 13:39:22.069864 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.069848 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-config\") pod \"a40190b1-f8f5-4c0a-9267-a3f911eae204\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " Apr 20 13:39:22.070128 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.069871 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-trusted-ca-bundle\") pod \"a40190b1-f8f5-4c0a-9267-a3f911eae204\" (UID: \"a40190b1-f8f5-4c0a-9267-a3f911eae204\") " Apr 20 13:39:22.070260 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.070234 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a40190b1-f8f5-4c0a-9267-a3f911eae204" (UID: "a40190b1-f8f5-4c0a-9267-a3f911eae204"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:39:22.070260 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.070230 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-service-ca" (OuterVolumeSpecName: "service-ca") pod "a40190b1-f8f5-4c0a-9267-a3f911eae204" (UID: "a40190b1-f8f5-4c0a-9267-a3f911eae204"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:39:22.070383 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.070358 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a40190b1-f8f5-4c0a-9267-a3f911eae204" (UID: "a40190b1-f8f5-4c0a-9267-a3f911eae204"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:39:22.070627 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.070599 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-config" (OuterVolumeSpecName: "console-config") pod "a40190b1-f8f5-4c0a-9267-a3f911eae204" (UID: "a40190b1-f8f5-4c0a-9267-a3f911eae204"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:39:22.071933 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.071907 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a40190b1-f8f5-4c0a-9267-a3f911eae204" (UID: "a40190b1-f8f5-4c0a-9267-a3f911eae204"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:39:22.072037 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.071966 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a40190b1-f8f5-4c0a-9267-a3f911eae204" (UID: "a40190b1-f8f5-4c0a-9267-a3f911eae204"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:39:22.072037 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.071970 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a40190b1-f8f5-4c0a-9267-a3f911eae204-kube-api-access-nqbkt" (OuterVolumeSpecName: "kube-api-access-nqbkt") pod "a40190b1-f8f5-4c0a-9267-a3f911eae204" (UID: "a40190b1-f8f5-4c0a-9267-a3f911eae204"). InnerVolumeSpecName "kube-api-access-nqbkt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:39:22.171191 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.171131 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-service-ca\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:22.171191 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.171187 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nqbkt\" (UniqueName: \"kubernetes.io/projected/a40190b1-f8f5-4c0a-9267-a3f911eae204-kube-api-access-nqbkt\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:22.171191 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.171198 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-config\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:22.171414 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.171208 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-trusted-ca-bundle\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:22.171414 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.171217 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-oauth-config\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:22.171414 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.171225 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a40190b1-f8f5-4c0a-9267-a3f911eae204-console-serving-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:22.171414 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.171233 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a40190b1-f8f5-4c0a-9267-a3f911eae204-oauth-serving-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:22.383245 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.383163 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp"] Apr 20 13:39:22.383422 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.383384 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" containerName="manager" containerID="cri-o://db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672" gracePeriod=2 Apr 20 13:39:22.390672 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.390645 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp"] Apr 20 13:39:22.415291 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.415257 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8"] Apr 20 13:39:22.415609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.415559 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" containerName="manager" containerID="cri-o://2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19" gracePeriod=2 Apr 20 13:39:22.423632 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.423605 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"] Apr 20 13:39:22.423979 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.423962 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a40190b1-f8f5-4c0a-9267-a3f911eae204" containerName="console" Apr 20 13:39:22.424057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.423982 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40190b1-f8f5-4c0a-9267-a3f911eae204" containerName="console" Apr 20 13:39:22.424057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.424007 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" containerName="manager" Apr 20 13:39:22.424057 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.424016 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" containerName="manager" Apr 20 13:39:22.424224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.424092 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a40190b1-f8f5-4c0a-9267-a3f911eae204" containerName="console" Apr 20 13:39:22.424224 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.424106 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" containerName="manager" Apr 20 13:39:22.427095 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.427068 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:22.428841 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.428810 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.430655 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.430634 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8"]
Apr 20 13:39:22.431164 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.431119 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.433334 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.433312 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.443239 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.443216 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"]
Apr 20 13:39:22.447661 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.447640 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"]
Apr 20 13:39:22.447950 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.447938 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" containerName="manager"
Apr 20 13:39:22.447996 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.447952 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" containerName="manager"
Apr 20 13:39:22.448030 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.448017 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" containerName="manager"
Apr 20 13:39:22.450946 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.450932 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"
Apr 20 13:39:22.465352 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.465330 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"]
Apr 20 13:39:22.516152 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.516104 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.518244 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.518221 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.574620 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.574584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7mhd\" (UniqueName: \"kubernetes.io/projected/d4d3eef1-43c9-40d8-8713-6556921727e4-kube-api-access-f7mhd\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5v7p9\" (UID: \"d4d3eef1-43c9-40d8-8713-6556921727e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:22.574759 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.574666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4d3eef1-43c9-40d8-8713-6556921727e4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5v7p9\" (UID: \"d4d3eef1-43c9-40d8-8713-6556921727e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:22.574759 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.574706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppddp\" (UniqueName: \"kubernetes.io/projected/f731df91-8b8d-4fa5-9e6e-8486c58fbcaa-kube-api-access-ppddp\") pod \"limitador-operator-controller-manager-85c4996f8c-fglm2\" (UID: \"f731df91-8b8d-4fa5-9e6e-8486c58fbcaa\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"
Apr 20 13:39:22.647921 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.647896 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp"
Apr 20 13:39:22.650295 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.650262 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.651273 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.651257 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8"
Apr 20 13:39:22.652316 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.652293 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.654421 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.654400 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.656464 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.656443 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.675501 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.675476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7mhd\" (UniqueName: \"kubernetes.io/projected/d4d3eef1-43c9-40d8-8713-6556921727e4-kube-api-access-f7mhd\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5v7p9\" (UID: \"d4d3eef1-43c9-40d8-8713-6556921727e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:22.675562 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.675518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4d3eef1-43c9-40d8-8713-6556921727e4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5v7p9\" (UID: \"d4d3eef1-43c9-40d8-8713-6556921727e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:22.675653 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.675629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppddp\" (UniqueName: \"kubernetes.io/projected/f731df91-8b8d-4fa5-9e6e-8486c58fbcaa-kube-api-access-ppddp\") pod \"limitador-operator-controller-manager-85c4996f8c-fglm2\" (UID: \"f731df91-8b8d-4fa5-9e6e-8486c58fbcaa\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"
Apr 20 13:39:22.675950 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.675936 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4d3eef1-43c9-40d8-8713-6556921727e4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5v7p9\" (UID: \"d4d3eef1-43c9-40d8-8713-6556921727e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:22.689698 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.689675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppddp\" (UniqueName: \"kubernetes.io/projected/f731df91-8b8d-4fa5-9e6e-8486c58fbcaa-kube-api-access-ppddp\") pod \"limitador-operator-controller-manager-85c4996f8c-fglm2\" (UID: \"f731df91-8b8d-4fa5-9e6e-8486c58fbcaa\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"
Apr 20 13:39:22.690931 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.690915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7mhd\" (UniqueName: \"kubernetes.io/projected/d4d3eef1-43c9-40d8-8713-6556921727e4-kube-api-access-f7mhd\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5v7p9\" (UID: \"d4d3eef1-43c9-40d8-8713-6556921727e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:22.760682 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.760662 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59bd67467f-6rb9f_a40190b1-f8f5-4c0a-9267-a3f911eae204/console/0.log"
Apr 20 13:39:22.761040 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.760699 2573 generic.go:358] "Generic (PLEG): container finished" podID="a40190b1-f8f5-4c0a-9267-a3f911eae204" containerID="233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d" exitCode=2
Apr 20 13:39:22.761040 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.760794 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59bd67467f-6rb9f"
Apr 20 13:39:22.761040 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.760790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bd67467f-6rb9f" event={"ID":"a40190b1-f8f5-4c0a-9267-a3f911eae204","Type":"ContainerDied","Data":"233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d"}
Apr 20 13:39:22.761040 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.760916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bd67467f-6rb9f" event={"ID":"a40190b1-f8f5-4c0a-9267-a3f911eae204","Type":"ContainerDied","Data":"d6af28a5ba0efd100783ead1e815ee418e3a2e1ca3fd32dc3ee94a44373ac5a9"}
Apr 20 13:39:22.761040 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.760950 2573 scope.go:117] "RemoveContainer" containerID="233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d"
Apr 20 13:39:22.762687 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.762368 2573 generic.go:358] "Generic (PLEG): container finished" podID="36680ac1-beab-40cb-a5aa-6a4d594bec24" containerID="2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19" exitCode=0
Apr 20 13:39:22.762687 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.762415 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8"
Apr 20 13:39:22.763960 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.763937 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" containerID="db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672" exitCode=0
Apr 20 13:39:22.764079 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.763986 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp"
Apr 20 13:39:22.769611 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.769559 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.770758 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.770741 2573 scope.go:117] "RemoveContainer" containerID="233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d"
Apr 20 13:39:22.771002 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:39:22.770984 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d\": container with ID starting with 233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d not found: ID does not exist" containerID="233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d"
Apr 20 13:39:22.771070 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.771015 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d"} err="failed to get container status \"233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d\": rpc error: code = NotFound desc = could not find container \"233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d\": container with ID starting with 233389f51b78118cb18b28235b7c292c2a9754956f69ed794da4fdde0203b33d not found: ID does not exist"
Apr 20 13:39:22.771070 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.771041 2573 scope.go:117] "RemoveContainer" containerID="2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19"
Apr 20 13:39:22.771617 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.771597 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.773626 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.773606 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.775690 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.775671 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:22.775954 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.775937 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twk58\" (UniqueName: \"kubernetes.io/projected/36680ac1-beab-40cb-a5aa-6a4d594bec24-kube-api-access-twk58\") pod \"36680ac1-beab-40cb-a5aa-6a4d594bec24\" (UID: \"36680ac1-beab-40cb-a5aa-6a4d594bec24\") "
Apr 20 13:39:22.776027 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.775974 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-extensions-socket-volume\") pod \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\" (UID: \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\") "
Apr 20 13:39:22.776086 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.776045 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr88r\" (UniqueName: \"kubernetes.io/projected/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-kube-api-access-rr88r\") pod \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\" (UID: \"cf65846c-5a0d-4a44-bd90-b55bcfa447ff\") "
Apr 20 13:39:22.776606 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.776578 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "cf65846c-5a0d-4a44-bd90-b55bcfa447ff" (UID: "cf65846c-5a0d-4a44-bd90-b55bcfa447ff"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 13:39:22.777854 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.777831 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36680ac1-beab-40cb-a5aa-6a4d594bec24-kube-api-access-twk58" (OuterVolumeSpecName: "kube-api-access-twk58") pod "36680ac1-beab-40cb-a5aa-6a4d594bec24" (UID: "36680ac1-beab-40cb-a5aa-6a4d594bec24"). InnerVolumeSpecName "kube-api-access-twk58". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:39:22.777965 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.777941 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-kube-api-access-rr88r" (OuterVolumeSpecName: "kube-api-access-rr88r") pod "cf65846c-5a0d-4a44-bd90-b55bcfa447ff" (UID: "cf65846c-5a0d-4a44-bd90-b55bcfa447ff"). InnerVolumeSpecName "kube-api-access-rr88r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:39:22.779746 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.779730 2573 scope.go:117] "RemoveContainer" containerID="2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19"
Apr 20 13:39:22.779981 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:39:22.779964 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19\": container with ID starting with 2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19 not found: ID does not exist" containerID="2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19"
Apr 20 13:39:22.780048 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.779992 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19"} err="failed to get container status \"2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19\": rpc error: code = NotFound desc = could not find container \"2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19\": container with ID starting with 2c48f0304891e6ebdda154b1ce1435eb4d6104353da8b16a5495704ad74b3f19 not found: ID does not exist"
Apr 20 13:39:22.780048 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.780017 2573 scope.go:117] "RemoveContainer" containerID="db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672"
Apr 20 13:39:22.789671 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.789653 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59bd67467f-6rb9f"]
Apr 20 13:39:22.793698 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.793683 2573 scope.go:117] "RemoveContainer" containerID="db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672"
Apr 20 13:39:22.793940 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:39:22.793920 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672\": container with ID starting with db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672 not found: ID does not exist" containerID="db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672"
Apr 20 13:39:22.794001 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.793950 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672"} err="failed to get container status \"db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672\": rpc error: code = NotFound desc = could not find container \"db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672\": container with ID starting with db0744ff7b71564b25830a1dd666ad5d51bc9c3dbd853598767d47edee20a672 not found: ID does not exist"
Apr 20 13:39:22.798725 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.798704 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59bd67467f-6rb9f"]
Apr 20 13:39:22.798803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.798776 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:22.805917 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.805896 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"
Apr 20 13:39:22.877575 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.877547 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twk58\" (UniqueName: \"kubernetes.io/projected/36680ac1-beab-40cb-a5aa-6a4d594bec24-kube-api-access-twk58\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:39:22.877575 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.877574 2573 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-extensions-socket-volume\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:39:22.877742 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.877585 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rr88r\" (UniqueName: \"kubernetes.io/projected/cf65846c-5a0d-4a44-bd90-b55bcfa447ff-kube-api-access-rr88r\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:39:22.935110 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.935079 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"]
Apr 20 13:39:22.936848 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:39:22.936823 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d3eef1_43c9_40d8_8713_6556921727e4.slice/crio-9c23b1a041eeca9bf7f08547f05a84202e24882567048df50abb131638ce1d48 WatchSource:0}: Error finding container 9c23b1a041eeca9bf7f08547f05a84202e24882567048df50abb131638ce1d48: Status 404 returned error can't find the container with id 9c23b1a041eeca9bf7f08547f05a84202e24882567048df50abb131638ce1d48
Apr 20 13:39:22.956893 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:22.956865 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"]
Apr 20 13:39:22.957759 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:39:22.957721 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf731df91_8b8d_4fa5_9e6e_8486c58fbcaa.slice/crio-6f62e538bc7c1495aa5ba32d881fc9721becbd4a7973e135bb7f87f8ae08d12e WatchSource:0}: Error finding container 6f62e538bc7c1495aa5ba32d881fc9721becbd4a7973e135bb7f87f8ae08d12e: Status 404 returned error can't find the container with id 6f62e538bc7c1495aa5ba32d881fc9721becbd4a7973e135bb7f87f8ae08d12e
Apr 20 13:39:23.073217 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.073182 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.075062 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.075039 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.076756 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.076733 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.078446 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.078424 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.463229 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.463194 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"]
Apr 20 13:39:23.466560 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.466543 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"
Apr 20 13:39:23.468884 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.468862 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.470918 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.470898 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.486784 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.486761 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"]
Apr 20 13:39:23.582107 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.582081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7c7\" (UniqueName: \"kubernetes.io/projected/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-kube-api-access-hz7c7\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p2gpn\" (UID: \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"
Apr 20 13:39:23.582264 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.582128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p2gpn\" (UID: \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"
Apr 20 13:39:23.683451 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.683405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7c7\" (UniqueName: \"kubernetes.io/projected/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-kube-api-access-hz7c7\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p2gpn\" (UID: \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"
Apr 20 13:39:23.683617 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.683468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p2gpn\" (UID: \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"
Apr 20 13:39:23.683813 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.683795 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p2gpn\" (UID: \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"
Apr 20 13:39:23.692334 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.692308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7c7\" (UniqueName: \"kubernetes.io/projected/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-kube-api-access-hz7c7\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p2gpn\" (UID: \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"
Apr 20 13:39:23.773373 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.773293 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" event={"ID":"d4d3eef1-43c9-40d8-8713-6556921727e4","Type":"ContainerStarted","Data":"f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6"}
Apr 20 13:39:23.773373 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.773325 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" event={"ID":"d4d3eef1-43c9-40d8-8713-6556921727e4","Type":"ContainerStarted","Data":"9c23b1a041eeca9bf7f08547f05a84202e24882567048df50abb131638ce1d48"}
Apr 20 13:39:23.773373 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.773361 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"
Apr 20 13:39:23.774598 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.774574 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2" event={"ID":"f731df91-8b8d-4fa5-9e6e-8486c58fbcaa","Type":"ContainerStarted","Data":"703b061d5d5d1df20d7beed2dae05eb5dbbec5a4c867f477924ff01348361c19"}
Apr 20 13:39:23.774701 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.774601 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2" event={"ID":"f731df91-8b8d-4fa5-9e6e-8486c58fbcaa","Type":"ContainerStarted","Data":"6f62e538bc7c1495aa5ba32d881fc9721becbd4a7973e135bb7f87f8ae08d12e"}
Apr 20 13:39:23.774701 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.774665 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2"
Apr 20 13:39:23.775740 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.775719 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.775860 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.775848 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"
Apr 20 13:39:23.796518 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.796483 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.798813 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.798785 2573 status_manager.go:895] "Failed to get status for pod" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object"
Apr 20 13:39:23.798944 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.798860 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" podStartSLOduration=1.7988485889999999 podStartE2EDuration="1.798848589s" podCreationTimestamp="2026-04-20 13:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:39:23.794227388 +0000 UTC m=+520.389918469" watchObservedRunningTime="2026-04-20 13:39:23.798848589 +0000 UTC m=+520.394539665"
Apr 20 13:39:23.800718 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.800695 2573 status_manager.go:895] "Failed to get status for pod"
podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object" Apr 20 13:39:23.827890 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.827823 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2" podStartSLOduration=1.827803411 podStartE2EDuration="1.827803411s" podCreationTimestamp="2026-04-20 13:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:39:23.824838568 +0000 UTC m=+520.420529645" watchObservedRunningTime="2026-04-20 13:39:23.827803411 +0000 UTC m=+520.423494489" Apr 20 13:39:23.902688 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.902664 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"] Apr 20 13:39:23.904874 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:39:23.904843 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dd8f1c0_f2de_4064_be40_0f7f772b58e4.slice/crio-b84ef8bc51eafcb7f5fb0b30627d1e9af5d756465fb59ea407f3ec4a2999e92c WatchSource:0}: Error finding container b84ef8bc51eafcb7f5fb0b30627d1e9af5d756465fb59ea407f3ec4a2999e92c: Status 404 returned error can't find the container with id b84ef8bc51eafcb7f5fb0b30627d1e9af5d756465fb59ea407f3ec4a2999e92c Apr 20 13:39:23.984644 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.984601 2573 status_manager.go:895] "Failed to get status for pod" 
podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-29xwp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-29xwp\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object" Apr 20 13:39:23.985468 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.985447 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" path="/var/lib/kubelet/pods/36680ac1-beab-40cb-a5aa-6a4d594bec24/volumes" Apr 20 13:39:23.985805 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.985793 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a40190b1-f8f5-4c0a-9267-a3f911eae204" path="/var/lib/kubelet/pods/a40190b1-f8f5-4c0a-9267-a3f911eae204/volumes" Apr 20 13:39:23.986102 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.986091 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf65846c-5a0d-4a44-bd90-b55bcfa447ff" path="/var/lib/kubelet/pods/cf65846c-5a0d-4a44-bd90-b55bcfa447ff/volumes" Apr 20 13:39:23.986930 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:23.986905 2573 status_manager.go:895] "Failed to get status for pod" podUID="36680ac1-beab-40cb-a5aa-6a4d594bec24" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-42lf8" err="pods \"limitador-operator-controller-manager-85c4996f8c-42lf8\" is forbidden: User \"system:node:ip-10-0-142-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-144.ec2.internal' and this object" Apr 20 13:39:24.780811 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:24.780776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" event={"ID":"0dd8f1c0-f2de-4064-be40-0f7f772b58e4","Type":"ContainerStarted","Data":"8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab"} Apr 20 13:39:24.780811 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:24.780814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" event={"ID":"0dd8f1c0-f2de-4064-be40-0f7f772b58e4","Type":"ContainerStarted","Data":"b84ef8bc51eafcb7f5fb0b30627d1e9af5d756465fb59ea407f3ec4a2999e92c"} Apr 20 13:39:24.781348 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:24.781172 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" Apr 20 13:39:24.801937 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:24.801890 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" podStartSLOduration=1.801873474 podStartE2EDuration="1.801873474s" podCreationTimestamp="2026-04-20 13:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:39:24.800300722 +0000 UTC m=+521.395991798" watchObservedRunningTime="2026-04-20 13:39:24.801873474 +0000 UTC m=+521.397564551" Apr 20 13:39:34.782816 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:34.782734 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-fglm2" Apr 20 13:39:34.782816 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:34.782787 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" Apr 20 13:39:35.787086 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:35.787056 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" Apr 20 13:39:35.842913 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:35.842887 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"] Apr 20 13:39:35.843125 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:35.843100 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" podUID="d4d3eef1-43c9-40d8-8713-6556921727e4" containerName="manager" containerID="cri-o://f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6" gracePeriod=10 Apr 20 13:39:36.086485 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.086464 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" Apr 20 13:39:36.190714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.190680 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7mhd\" (UniqueName: \"kubernetes.io/projected/d4d3eef1-43c9-40d8-8713-6556921727e4-kube-api-access-f7mhd\") pod \"d4d3eef1-43c9-40d8-8713-6556921727e4\" (UID: \"d4d3eef1-43c9-40d8-8713-6556921727e4\") " Apr 20 13:39:36.190884 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.190736 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4d3eef1-43c9-40d8-8713-6556921727e4-extensions-socket-volume\") pod \"d4d3eef1-43c9-40d8-8713-6556921727e4\" (UID: \"d4d3eef1-43c9-40d8-8713-6556921727e4\") " Apr 20 13:39:36.191129 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.191104 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d4d3eef1-43c9-40d8-8713-6556921727e4-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "d4d3eef1-43c9-40d8-8713-6556921727e4" (UID: "d4d3eef1-43c9-40d8-8713-6556921727e4"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:39:36.192808 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.192789 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d3eef1-43c9-40d8-8713-6556921727e4-kube-api-access-f7mhd" (OuterVolumeSpecName: "kube-api-access-f7mhd") pod "d4d3eef1-43c9-40d8-8713-6556921727e4" (UID: "d4d3eef1-43c9-40d8-8713-6556921727e4"). InnerVolumeSpecName "kube-api-access-f7mhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:39:36.292015 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.291979 2573 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4d3eef1-43c9-40d8-8713-6556921727e4-extensions-socket-volume\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:36.292015 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.292006 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7mhd\" (UniqueName: \"kubernetes.io/projected/d4d3eef1-43c9-40d8-8713-6556921727e4-kube-api-access-f7mhd\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:39:36.829355 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.829322 2573 generic.go:358] "Generic (PLEG): container finished" podID="d4d3eef1-43c9-40d8-8713-6556921727e4" containerID="f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6" exitCode=0 Apr 20 13:39:36.829355 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.829359 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" 
event={"ID":"d4d3eef1-43c9-40d8-8713-6556921727e4","Type":"ContainerDied","Data":"f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6"} Apr 20 13:39:36.829816 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.829380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" event={"ID":"d4d3eef1-43c9-40d8-8713-6556921727e4","Type":"ContainerDied","Data":"9c23b1a041eeca9bf7f08547f05a84202e24882567048df50abb131638ce1d48"} Apr 20 13:39:36.829816 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.829383 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9" Apr 20 13:39:36.829816 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.829394 2573 scope.go:117] "RemoveContainer" containerID="f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6" Apr 20 13:39:36.838192 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.838174 2573 scope.go:117] "RemoveContainer" containerID="f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6" Apr 20 13:39:36.838396 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:39:36.838376 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6\": container with ID starting with f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6 not found: ID does not exist" containerID="f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6" Apr 20 13:39:36.838478 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.838409 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6"} err="failed to get container status \"f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6\": rpc error: 
code = NotFound desc = could not find container \"f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6\": container with ID starting with f2cf0c98a551367713a04604378067b3c2df38c2c079fdc41435411df3df0ca6 not found: ID does not exist" Apr 20 13:39:36.855379 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.855352 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"] Apr 20 13:39:36.858875 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:36.858852 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5v7p9"] Apr 20 13:39:37.980110 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:37.980069 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d3eef1-43c9-40d8-8713-6556921727e4" path="/var/lib/kubelet/pods/d4d3eef1-43c9-40d8-8713-6556921727e4/volumes" Apr 20 13:39:57.366775 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.366744 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4wdx7"] Apr 20 13:39:57.367312 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.367215 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4d3eef1-43c9-40d8-8713-6556921727e4" containerName="manager" Apr 20 13:39:57.367312 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.367234 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d3eef1-43c9-40d8-8713-6556921727e4" containerName="manager" Apr 20 13:39:57.367312 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.367301 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4d3eef1-43c9-40d8-8713-6556921727e4" containerName="manager" Apr 20 13:39:57.371978 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.371960 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" Apr 20 13:39:57.374291 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.374270 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-rl97x\"" Apr 20 13:39:57.379006 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.378980 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4wdx7"] Apr 20 13:39:57.460937 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.460904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnwk\" (UniqueName: \"kubernetes.io/projected/001e2301-0efe-4261-afc7-82320d4067e3-kube-api-access-8mnwk\") pod \"authorino-f99f4b5cd-4wdx7\" (UID: \"001e2301-0efe-4261-afc7-82320d4067e3\") " pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" Apr 20 13:39:57.546975 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.546921 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-6cd54"] Apr 20 13:39:57.550225 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.550209 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6cd54" Apr 20 13:39:57.556571 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.556549 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-6cd54"] Apr 20 13:39:57.562327 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.562300 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnwk\" (UniqueName: \"kubernetes.io/projected/001e2301-0efe-4261-afc7-82320d4067e3-kube-api-access-8mnwk\") pod \"authorino-f99f4b5cd-4wdx7\" (UID: \"001e2301-0efe-4261-afc7-82320d4067e3\") " pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" Apr 20 13:39:57.570624 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.570604 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnwk\" (UniqueName: \"kubernetes.io/projected/001e2301-0efe-4261-afc7-82320d4067e3-kube-api-access-8mnwk\") pod \"authorino-f99f4b5cd-4wdx7\" (UID: \"001e2301-0efe-4261-afc7-82320d4067e3\") " pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" Apr 20 13:39:57.663764 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.663688 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2nmx\" (UniqueName: \"kubernetes.io/projected/209fd00a-d607-4d6a-8b53-b5cde57de529-kube-api-access-h2nmx\") pod \"authorino-7498df8756-6cd54\" (UID: \"209fd00a-d607-4d6a-8b53-b5cde57de529\") " pod="kuadrant-system/authorino-7498df8756-6cd54" Apr 20 13:39:57.682120 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.682101 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" Apr 20 13:39:57.764798 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.764765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2nmx\" (UniqueName: \"kubernetes.io/projected/209fd00a-d607-4d6a-8b53-b5cde57de529-kube-api-access-h2nmx\") pod \"authorino-7498df8756-6cd54\" (UID: \"209fd00a-d607-4d6a-8b53-b5cde57de529\") " pod="kuadrant-system/authorino-7498df8756-6cd54" Apr 20 13:39:57.773921 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.773892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2nmx\" (UniqueName: \"kubernetes.io/projected/209fd00a-d607-4d6a-8b53-b5cde57de529-kube-api-access-h2nmx\") pod \"authorino-7498df8756-6cd54\" (UID: \"209fd00a-d607-4d6a-8b53-b5cde57de529\") " pod="kuadrant-system/authorino-7498df8756-6cd54" Apr 20 13:39:57.805821 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.805792 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4wdx7"] Apr 20 13:39:57.808848 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:39:57.808820 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001e2301_0efe_4261_afc7_82320d4067e3.slice/crio-bc8465068cce8b8efccfb72466733d48a867bb6f68c41a00b27178aee725c8dc WatchSource:0}: Error finding container bc8465068cce8b8efccfb72466733d48a867bb6f68c41a00b27178aee725c8dc: Status 404 returned error can't find the container with id bc8465068cce8b8efccfb72466733d48a867bb6f68c41a00b27178aee725c8dc Apr 20 13:39:57.859835 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.859804 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6cd54" Apr 20 13:39:57.911411 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.911367 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" event={"ID":"001e2301-0efe-4261-afc7-82320d4067e3","Type":"ContainerStarted","Data":"bc8465068cce8b8efccfb72466733d48a867bb6f68c41a00b27178aee725c8dc"} Apr 20 13:39:57.981481 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:39:57.981435 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod209fd00a_d607_4d6a_8b53_b5cde57de529.slice/crio-10e7cb3032127cab83ad569f66e3c00a9beeff65eec27972ef6d8c1c9750d45a WatchSource:0}: Error finding container 10e7cb3032127cab83ad569f66e3c00a9beeff65eec27972ef6d8c1c9750d45a: Status 404 returned error can't find the container with id 10e7cb3032127cab83ad569f66e3c00a9beeff65eec27972ef6d8c1c9750d45a Apr 20 13:39:57.983217 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:57.983198 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-6cd54"] Apr 20 13:39:58.919554 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:39:58.919484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6cd54" event={"ID":"209fd00a-d607-4d6a-8b53-b5cde57de529","Type":"ContainerStarted","Data":"10e7cb3032127cab83ad569f66e3c00a9beeff65eec27972ef6d8c1c9750d45a"} Apr 20 13:40:00.931767 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:00.931731 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6cd54" event={"ID":"209fd00a-d607-4d6a-8b53-b5cde57de529","Type":"ContainerStarted","Data":"67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d"} Apr 20 13:40:00.933049 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:00.933023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" event={"ID":"001e2301-0efe-4261-afc7-82320d4067e3","Type":"ContainerStarted","Data":"537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6"} Apr 20 13:40:00.955179 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:00.955112 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-6cd54" podStartSLOduration=1.625716994 podStartE2EDuration="3.955094845s" podCreationTimestamp="2026-04-20 13:39:57 +0000 UTC" firstStartedPulling="2026-04-20 13:39:57.982738462 +0000 UTC m=+554.578429515" lastFinishedPulling="2026-04-20 13:40:00.312116312 +0000 UTC m=+556.907807366" observedRunningTime="2026-04-20 13:40:00.954449402 +0000 UTC m=+557.550140478" watchObservedRunningTime="2026-04-20 13:40:00.955094845 +0000 UTC m=+557.550785922" Apr 20 13:40:00.972674 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:00.972622 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" podStartSLOduration=1.459280697 podStartE2EDuration="3.972609853s" podCreationTimestamp="2026-04-20 13:39:57 +0000 UTC" firstStartedPulling="2026-04-20 13:39:57.810071171 +0000 UTC m=+554.405762224" lastFinishedPulling="2026-04-20 13:40:00.323400309 +0000 UTC m=+556.919091380" observedRunningTime="2026-04-20 13:40:00.971981983 +0000 UTC m=+557.567673060" watchObservedRunningTime="2026-04-20 13:40:00.972609853 +0000 UTC m=+557.568300928" Apr 20 13:40:01.004618 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:01.004590 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4wdx7"] Apr 20 13:40:02.940713 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:02.940673 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" podUID="001e2301-0efe-4261-afc7-82320d4067e3" containerName="authorino" 
containerID="cri-o://537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6" gracePeriod=30 Apr 20 13:40:03.195918 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.195853 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" Apr 20 13:40:03.311205 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.311166 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mnwk\" (UniqueName: \"kubernetes.io/projected/001e2301-0efe-4261-afc7-82320d4067e3-kube-api-access-8mnwk\") pod \"001e2301-0efe-4261-afc7-82320d4067e3\" (UID: \"001e2301-0efe-4261-afc7-82320d4067e3\") " Apr 20 13:40:03.313281 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.313253 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001e2301-0efe-4261-afc7-82320d4067e3-kube-api-access-8mnwk" (OuterVolumeSpecName: "kube-api-access-8mnwk") pod "001e2301-0efe-4261-afc7-82320d4067e3" (UID: "001e2301-0efe-4261-afc7-82320d4067e3"). InnerVolumeSpecName "kube-api-access-8mnwk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:40:03.412595 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.412554 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mnwk\" (UniqueName: \"kubernetes.io/projected/001e2301-0efe-4261-afc7-82320d4067e3-kube-api-access-8mnwk\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:40:03.945187 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.945117 2573 generic.go:358] "Generic (PLEG): container finished" podID="001e2301-0efe-4261-afc7-82320d4067e3" containerID="537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6" exitCode=0 Apr 20 13:40:03.945621 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.945207 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" Apr 20 13:40:03.945621 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.945214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" event={"ID":"001e2301-0efe-4261-afc7-82320d4067e3","Type":"ContainerDied","Data":"537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6"} Apr 20 13:40:03.945621 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.945258 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4wdx7" event={"ID":"001e2301-0efe-4261-afc7-82320d4067e3","Type":"ContainerDied","Data":"bc8465068cce8b8efccfb72466733d48a867bb6f68c41a00b27178aee725c8dc"} Apr 20 13:40:03.945621 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.945274 2573 scope.go:117] "RemoveContainer" containerID="537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6" Apr 20 13:40:03.953933 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.953913 2573 scope.go:117] "RemoveContainer" containerID="537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6" Apr 20 13:40:03.954202 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:40:03.954181 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6\": container with ID starting with 537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6 not found: ID does not exist" containerID="537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6" Apr 20 13:40:03.954268 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.954209 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6"} err="failed to get container status \"537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6\": rpc error: code = 
NotFound desc = could not find container \"537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6\": container with ID starting with 537d6c88bfea044c5f280125064d775a9e6a511f70a6d494c58b220a086084a6 not found: ID does not exist"
Apr 20 13:40:03.965998 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.965971 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4wdx7"]
Apr 20 13:40:03.970610 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.970589 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4wdx7"]
Apr 20 13:40:03.979704 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:03.979681 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001e2301-0efe-4261-afc7-82320d4067e3" path="/var/lib/kubelet/pods/001e2301-0efe-4261-afc7-82320d4067e3/volumes"
Apr 20 13:40:30.532918 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.532882 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hpbzj"]
Apr 20 13:40:30.533385 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.533220 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="001e2301-0efe-4261-afc7-82320d4067e3" containerName="authorino"
Apr 20 13:40:30.533385 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.533233 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="001e2301-0efe-4261-afc7-82320d4067e3" containerName="authorino"
Apr 20 13:40:30.533385 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.533295 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="001e2301-0efe-4261-afc7-82320d4067e3" containerName="authorino"
Apr 20 13:40:30.536535 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.536505 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-hpbzj"
Apr 20 13:40:30.545679 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.545650 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hpbzj"]
Apr 20 13:40:30.648385 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.648352 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jpj7\" (UniqueName: \"kubernetes.io/projected/e870296d-ec42-47ab-80bf-450dfe418bc3-kube-api-access-6jpj7\") pod \"authorino-8b475cf9f-hpbzj\" (UID: \"e870296d-ec42-47ab-80bf-450dfe418bc3\") " pod="kuadrant-system/authorino-8b475cf9f-hpbzj"
Apr 20 13:40:30.749435 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.749400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jpj7\" (UniqueName: \"kubernetes.io/projected/e870296d-ec42-47ab-80bf-450dfe418bc3-kube-api-access-6jpj7\") pod \"authorino-8b475cf9f-hpbzj\" (UID: \"e870296d-ec42-47ab-80bf-450dfe418bc3\") " pod="kuadrant-system/authorino-8b475cf9f-hpbzj"
Apr 20 13:40:30.762220 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.762189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jpj7\" (UniqueName: \"kubernetes.io/projected/e870296d-ec42-47ab-80bf-450dfe418bc3-kube-api-access-6jpj7\") pod \"authorino-8b475cf9f-hpbzj\" (UID: \"e870296d-ec42-47ab-80bf-450dfe418bc3\") " pod="kuadrant-system/authorino-8b475cf9f-hpbzj"
Apr 20 13:40:30.805573 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.805538 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hpbzj"]
Apr 20 13:40:30.805799 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.805783 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-hpbzj"
Apr 20 13:40:30.843324 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.843289 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-685c6c6cf7-b9s6n"]
Apr 20 13:40:30.847662 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.847643 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-685c6c6cf7-b9s6n"
Apr 20 13:40:30.852525 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.852503 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-685c6c6cf7-b9s6n"]
Apr 20 13:40:30.926120 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.926091 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-685c6c6cf7-b9s6n"]
Apr 20 13:40:30.926365 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:40:30.926346 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8lmfj], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-685c6c6cf7-b9s6n" podUID="37a12772-7503-4c3a-a33c-f69439a3b4c1"
Apr 20 13:40:30.938697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.938668 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hpbzj"]
Apr 20 13:40:30.940896 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:40:30.940871 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode870296d_ec42_47ab_80bf_450dfe418bc3.slice/crio-60c504bedd26e0fe8c1e730db3417d93db4f1339c941eb5672991aace94b65cc WatchSource:0}: Error finding container 60c504bedd26e0fe8c1e730db3417d93db4f1339c941eb5672991aace94b65cc: Status 404 returned error can't find the container with id 60c504bedd26e0fe8c1e730db3417d93db4f1339c941eb5672991aace94b65cc
Apr 20 13:40:30.951343 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.951318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lmfj\" (UniqueName: \"kubernetes.io/projected/37a12772-7503-4c3a-a33c-f69439a3b4c1-kube-api-access-8lmfj\") pod \"authorino-685c6c6cf7-b9s6n\" (UID: \"37a12772-7503-4c3a-a33c-f69439a3b4c1\") " pod="kuadrant-system/authorino-685c6c6cf7-b9s6n"
Apr 20 13:40:30.993698 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.993670 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-876f8b889-wnm5f"]
Apr 20 13:40:30.997064 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.997049 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-876f8b889-wnm5f"
Apr 20 13:40:30.999571 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:30.999550 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 20 13:40:31.005238 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.005217 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-876f8b889-wnm5f"]
Apr 20 13:40:31.044890 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.044859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-hpbzj" event={"ID":"e870296d-ec42-47ab-80bf-450dfe418bc3","Type":"ContainerStarted","Data":"60c504bedd26e0fe8c1e730db3417d93db4f1339c941eb5672991aace94b65cc"}
Apr 20 13:40:31.045023 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.044912 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-685c6c6cf7-b9s6n"
Apr 20 13:40:31.049079 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.049060 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-685c6c6cf7-b9s6n"
Apr 20 13:40:31.052019 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.052000 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lmfj\" (UniqueName: \"kubernetes.io/projected/37a12772-7503-4c3a-a33c-f69439a3b4c1-kube-api-access-8lmfj\") pod \"authorino-685c6c6cf7-b9s6n\" (UID: \"37a12772-7503-4c3a-a33c-f69439a3b4c1\") " pod="kuadrant-system/authorino-685c6c6cf7-b9s6n"
Apr 20 13:40:31.060079 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.060028 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lmfj\" (UniqueName: \"kubernetes.io/projected/37a12772-7503-4c3a-a33c-f69439a3b4c1-kube-api-access-8lmfj\") pod \"authorino-685c6c6cf7-b9s6n\" (UID: \"37a12772-7503-4c3a-a33c-f69439a3b4c1\") " pod="kuadrant-system/authorino-685c6c6cf7-b9s6n"
Apr 20 13:40:31.152676 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.152654 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lmfj\" (UniqueName: \"kubernetes.io/projected/37a12772-7503-4c3a-a33c-f69439a3b4c1-kube-api-access-8lmfj\") pod \"37a12772-7503-4c3a-a33c-f69439a3b4c1\" (UID: \"37a12772-7503-4c3a-a33c-f69439a3b4c1\") "
Apr 20 13:40:31.152818 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.152791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdr69\" (UniqueName: \"kubernetes.io/projected/a149d8d3-6df7-4813-995b-6647fced3c56-kube-api-access-qdr69\") pod \"authorino-876f8b889-wnm5f\" (UID: \"a149d8d3-6df7-4813-995b-6647fced3c56\") " pod="kuadrant-system/authorino-876f8b889-wnm5f"
Apr 20 13:40:31.152874 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.152860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a149d8d3-6df7-4813-995b-6647fced3c56-tls-cert\") pod \"authorino-876f8b889-wnm5f\" (UID: \"a149d8d3-6df7-4813-995b-6647fced3c56\") " pod="kuadrant-system/authorino-876f8b889-wnm5f"
Apr 20 13:40:31.154444 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.154424 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a12772-7503-4c3a-a33c-f69439a3b4c1-kube-api-access-8lmfj" (OuterVolumeSpecName: "kube-api-access-8lmfj") pod "37a12772-7503-4c3a-a33c-f69439a3b4c1" (UID: "37a12772-7503-4c3a-a33c-f69439a3b4c1"). InnerVolumeSpecName "kube-api-access-8lmfj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:40:31.253967 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.253945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdr69\" (UniqueName: \"kubernetes.io/projected/a149d8d3-6df7-4813-995b-6647fced3c56-kube-api-access-qdr69\") pod \"authorino-876f8b889-wnm5f\" (UID: \"a149d8d3-6df7-4813-995b-6647fced3c56\") " pod="kuadrant-system/authorino-876f8b889-wnm5f"
Apr 20 13:40:31.254064 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.254003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a149d8d3-6df7-4813-995b-6647fced3c56-tls-cert\") pod \"authorino-876f8b889-wnm5f\" (UID: \"a149d8d3-6df7-4813-995b-6647fced3c56\") " pod="kuadrant-system/authorino-876f8b889-wnm5f"
Apr 20 13:40:31.254064 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.254034 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8lmfj\" (UniqueName: \"kubernetes.io/projected/37a12772-7503-4c3a-a33c-f69439a3b4c1-kube-api-access-8lmfj\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:40:31.256221 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.256197 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a149d8d3-6df7-4813-995b-6647fced3c56-tls-cert\") pod \"authorino-876f8b889-wnm5f\" (UID: \"a149d8d3-6df7-4813-995b-6647fced3c56\") " pod="kuadrant-system/authorino-876f8b889-wnm5f"
Apr 20 13:40:31.262829 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.262807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdr69\" (UniqueName: \"kubernetes.io/projected/a149d8d3-6df7-4813-995b-6647fced3c56-kube-api-access-qdr69\") pod \"authorino-876f8b889-wnm5f\" (UID: \"a149d8d3-6df7-4813-995b-6647fced3c56\") " pod="kuadrant-system/authorino-876f8b889-wnm5f"
Apr 20 13:40:31.307623 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.307591 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-876f8b889-wnm5f"
Apr 20 13:40:31.435622 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:31.435598 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-876f8b889-wnm5f"]
Apr 20 13:40:31.437979 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:40:31.437949 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda149d8d3_6df7_4813_995b_6647fced3c56.slice/crio-71094a35fd78d179df7d19afb555d370ee90525fbbb55bfc4ad1f767fc3b3f55 WatchSource:0}: Error finding container 71094a35fd78d179df7d19afb555d370ee90525fbbb55bfc4ad1f767fc3b3f55: Status 404 returned error can't find the container with id 71094a35fd78d179df7d19afb555d370ee90525fbbb55bfc4ad1f767fc3b3f55
Apr 20 13:40:32.050413 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.050374 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-hpbzj" event={"ID":"e870296d-ec42-47ab-80bf-450dfe418bc3","Type":"ContainerStarted","Data":"ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6"}
Apr 20 13:40:32.050901 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.050469 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-hpbzj" podUID="e870296d-ec42-47ab-80bf-450dfe418bc3" containerName="authorino" containerID="cri-o://ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6" gracePeriod=30
Apr 20 13:40:32.051732 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.051707 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-876f8b889-wnm5f" event={"ID":"a149d8d3-6df7-4813-995b-6647fced3c56","Type":"ContainerStarted","Data":"3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd"}
Apr 20 13:40:32.051888 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.051735 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-876f8b889-wnm5f" event={"ID":"a149d8d3-6df7-4813-995b-6647fced3c56","Type":"ContainerStarted","Data":"71094a35fd78d179df7d19afb555d370ee90525fbbb55bfc4ad1f767fc3b3f55"}
Apr 20 13:40:32.051888 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.051782 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-685c6c6cf7-b9s6n"
Apr 20 13:40:32.066735 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.066689 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-hpbzj" podStartSLOduration=1.792605833 podStartE2EDuration="2.066674897s" podCreationTimestamp="2026-04-20 13:40:30 +0000 UTC" firstStartedPulling="2026-04-20 13:40:30.942061259 +0000 UTC m=+587.537752313" lastFinishedPulling="2026-04-20 13:40:31.216130323 +0000 UTC m=+587.811821377" observedRunningTime="2026-04-20 13:40:32.065531765 +0000 UTC m=+588.661222844" watchObservedRunningTime="2026-04-20 13:40:32.066674897 +0000 UTC m=+588.662365973"
Apr 20 13:40:32.082491 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.082442 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-876f8b889-wnm5f" podStartSLOduration=1.753224179 podStartE2EDuration="2.082423617s" podCreationTimestamp="2026-04-20 13:40:30 +0000 UTC" firstStartedPulling="2026-04-20 13:40:31.439332082 +0000 UTC m=+588.035023135" lastFinishedPulling="2026-04-20 13:40:31.768531503 +0000 UTC m=+588.364222573" observedRunningTime="2026-04-20 13:40:32.0804217 +0000 UTC m=+588.676112779" watchObservedRunningTime="2026-04-20 13:40:32.082423617 +0000 UTC m=+588.678114696"
Apr 20 13:40:32.107218 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.107186 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-685c6c6cf7-b9s6n"]
Apr 20 13:40:32.114474 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.114429 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-685c6c6cf7-b9s6n"]
Apr 20 13:40:32.116791 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.116714 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-6cd54"]
Apr 20 13:40:32.116961 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.116903 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-6cd54" podUID="209fd00a-d607-4d6a-8b53-b5cde57de529" containerName="authorino" containerID="cri-o://67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d" gracePeriod=30
Apr 20 13:40:32.371766 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.371743 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-hpbzj"
Apr 20 13:40:32.376950 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.376932 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6cd54"
Apr 20 13:40:32.571636 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.571537 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jpj7\" (UniqueName: \"kubernetes.io/projected/e870296d-ec42-47ab-80bf-450dfe418bc3-kube-api-access-6jpj7\") pod \"e870296d-ec42-47ab-80bf-450dfe418bc3\" (UID: \"e870296d-ec42-47ab-80bf-450dfe418bc3\") "
Apr 20 13:40:32.571822 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.571644 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2nmx\" (UniqueName: \"kubernetes.io/projected/209fd00a-d607-4d6a-8b53-b5cde57de529-kube-api-access-h2nmx\") pod \"209fd00a-d607-4d6a-8b53-b5cde57de529\" (UID: \"209fd00a-d607-4d6a-8b53-b5cde57de529\") "
Apr 20 13:40:32.573721 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.573684 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/209fd00a-d607-4d6a-8b53-b5cde57de529-kube-api-access-h2nmx" (OuterVolumeSpecName: "kube-api-access-h2nmx") pod "209fd00a-d607-4d6a-8b53-b5cde57de529" (UID: "209fd00a-d607-4d6a-8b53-b5cde57de529"). InnerVolumeSpecName "kube-api-access-h2nmx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:40:32.573840 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.573728 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e870296d-ec42-47ab-80bf-450dfe418bc3-kube-api-access-6jpj7" (OuterVolumeSpecName: "kube-api-access-6jpj7") pod "e870296d-ec42-47ab-80bf-450dfe418bc3" (UID: "e870296d-ec42-47ab-80bf-450dfe418bc3"). InnerVolumeSpecName "kube-api-access-6jpj7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:40:32.672999 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.672960 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jpj7\" (UniqueName: \"kubernetes.io/projected/e870296d-ec42-47ab-80bf-450dfe418bc3-kube-api-access-6jpj7\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:40:32.672999 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.672991 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h2nmx\" (UniqueName: \"kubernetes.io/projected/209fd00a-d607-4d6a-8b53-b5cde57de529-kube-api-access-h2nmx\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:40:32.962418 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.962334 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bjvsg"]
Apr 20 13:40:32.962672 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.962660 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="209fd00a-d607-4d6a-8b53-b5cde57de529" containerName="authorino"
Apr 20 13:40:32.962715 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.962673 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="209fd00a-d607-4d6a-8b53-b5cde57de529" containerName="authorino"
Apr 20 13:40:32.962715 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.962682 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e870296d-ec42-47ab-80bf-450dfe418bc3" containerName="authorino"
Apr 20 13:40:32.962715 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.962687 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e870296d-ec42-47ab-80bf-450dfe418bc3" containerName="authorino"
Apr 20 13:40:32.962817 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.962738 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e870296d-ec42-47ab-80bf-450dfe418bc3" containerName="authorino"
Apr 20 13:40:32.962817 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.962750 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="209fd00a-d607-4d6a-8b53-b5cde57de529" containerName="authorino"
Apr 20 13:40:32.966400 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.966380 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg"
Apr 20 13:40:32.968561 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.968540 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-d7t72\""
Apr 20 13:40:32.975791 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.975766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49lpj\" (UniqueName: \"kubernetes.io/projected/33b1e997-2189-49f6-b728-00622d7ddbc0-kube-api-access-49lpj\") pod \"maas-controller-6d4c8f55f9-bjvsg\" (UID: \"33b1e997-2189-49f6-b728-00622d7ddbc0\") " pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg"
Apr 20 13:40:32.980897 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:32.980869 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bjvsg"]
Apr 20 13:40:33.058001 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.057961 2573 generic.go:358] "Generic (PLEG): container finished" podID="209fd00a-d607-4d6a-8b53-b5cde57de529" containerID="67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d" exitCode=0
Apr 20 13:40:33.058499 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.058017 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6cd54"
Apr 20 13:40:33.058499 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.058049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6cd54" event={"ID":"209fd00a-d607-4d6a-8b53-b5cde57de529","Type":"ContainerDied","Data":"67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d"}
Apr 20 13:40:33.058499 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.058096 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6cd54" event={"ID":"209fd00a-d607-4d6a-8b53-b5cde57de529","Type":"ContainerDied","Data":"10e7cb3032127cab83ad569f66e3c00a9beeff65eec27972ef6d8c1c9750d45a"}
Apr 20 13:40:33.058499 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.058117 2573 scope.go:117] "RemoveContainer" containerID="67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d"
Apr 20 13:40:33.059394 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.059373 2573 generic.go:358] "Generic (PLEG): container finished" podID="e870296d-ec42-47ab-80bf-450dfe418bc3" containerID="ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6" exitCode=0
Apr 20 13:40:33.059483 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.059417 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-hpbzj"
Apr 20 13:40:33.059483 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.059461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-hpbzj" event={"ID":"e870296d-ec42-47ab-80bf-450dfe418bc3","Type":"ContainerDied","Data":"ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6"}
Apr 20 13:40:33.059589 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.059485 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-hpbzj" event={"ID":"e870296d-ec42-47ab-80bf-450dfe418bc3","Type":"ContainerDied","Data":"60c504bedd26e0fe8c1e730db3417d93db4f1339c941eb5672991aace94b65cc"}
Apr 20 13:40:33.068980 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.068960 2573 scope.go:117] "RemoveContainer" containerID="67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d"
Apr 20 13:40:33.069342 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:40:33.069319 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d\": container with ID starting with 67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d not found: ID does not exist" containerID="67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d"
Apr 20 13:40:33.069415 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.069354 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d"} err="failed to get container status \"67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d\": rpc error: code = NotFound desc = could not find container \"67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d\": container with ID starting with 67db277f6f24c5e572d376724cb3403cb44d6204dc42270b31ac6247dfdf643d not found: ID does not exist"
Apr 20 13:40:33.069415 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.069378 2573 scope.go:117] "RemoveContainer" containerID="ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6"
Apr 20 13:40:33.076289 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.076262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49lpj\" (UniqueName: \"kubernetes.io/projected/33b1e997-2189-49f6-b728-00622d7ddbc0-kube-api-access-49lpj\") pod \"maas-controller-6d4c8f55f9-bjvsg\" (UID: \"33b1e997-2189-49f6-b728-00622d7ddbc0\") " pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg"
Apr 20 13:40:33.077376 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.077362 2573 scope.go:117] "RemoveContainer" containerID="ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6"
Apr 20 13:40:33.077666 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:40:33.077647 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6\": container with ID starting with ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6 not found: ID does not exist" containerID="ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6"
Apr 20 13:40:33.077731 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.077676 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6"} err="failed to get container status \"ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6\": rpc error: code = NotFound desc = could not find container \"ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6\": container with ID starting with ebbef861da08971d29352f26599a1629d77dcf89dca428039c3a5e54c29049b6 not found: ID does not exist"
Apr 20 13:40:33.086038 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.086011 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-6cd54"]
Apr 20 13:40:33.089352 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.089326 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49lpj\" (UniqueName: \"kubernetes.io/projected/33b1e997-2189-49f6-b728-00622d7ddbc0-kube-api-access-49lpj\") pod \"maas-controller-6d4c8f55f9-bjvsg\" (UID: \"33b1e997-2189-49f6-b728-00622d7ddbc0\") " pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg"
Apr 20 13:40:33.090054 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.090035 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-6cd54"]
Apr 20 13:40:33.101555 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.101526 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hpbzj"]
Apr 20 13:40:33.107537 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.107513 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-hpbzj"]
Apr 20 13:40:33.112118 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.112095 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-794b49f56f-6zmv9"]
Apr 20 13:40:33.117160 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.117120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-794b49f56f-6zmv9"
Apr 20 13:40:33.123623 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.123590 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-794b49f56f-6zmv9"]
Apr 20 13:40:33.176997 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.176953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntfls\" (UniqueName: \"kubernetes.io/projected/dbb1277c-2495-4cb6-a7a2-3daef6fa7853-kube-api-access-ntfls\") pod \"maas-controller-794b49f56f-6zmv9\" (UID: \"dbb1277c-2495-4cb6-a7a2-3daef6fa7853\") " pod="opendatahub/maas-controller-794b49f56f-6zmv9"
Apr 20 13:40:33.241101 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.240989 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bjvsg"]
Apr 20 13:40:33.241535 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.241513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg"
Apr 20 13:40:33.277458 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.277418 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntfls\" (UniqueName: \"kubernetes.io/projected/dbb1277c-2495-4cb6-a7a2-3daef6fa7853-kube-api-access-ntfls\") pod \"maas-controller-794b49f56f-6zmv9\" (UID: \"dbb1277c-2495-4cb6-a7a2-3daef6fa7853\") " pod="opendatahub/maas-controller-794b49f56f-6zmv9"
Apr 20 13:40:33.285913 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.285875 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntfls\" (UniqueName: \"kubernetes.io/projected/dbb1277c-2495-4cb6-a7a2-3daef6fa7853-kube-api-access-ntfls\") pod \"maas-controller-794b49f56f-6zmv9\" (UID: \"dbb1277c-2495-4cb6-a7a2-3daef6fa7853\") " pod="opendatahub/maas-controller-794b49f56f-6zmv9"
Apr 20 13:40:33.395726 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.395686 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bjvsg"]
Apr 20 13:40:33.397881 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:40:33.397853 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b1e997_2189_49f6_b728_00622d7ddbc0.slice/crio-63cdede69257a8bdd9f42191ddcd32f708ba9762f3c7c514cfbeab20ed0d2dbc WatchSource:0}: Error finding container 63cdede69257a8bdd9f42191ddcd32f708ba9762f3c7c514cfbeab20ed0d2dbc: Status 404 returned error can't find the container with id 63cdede69257a8bdd9f42191ddcd32f708ba9762f3c7c514cfbeab20ed0d2dbc
Apr 20 13:40:33.429559 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.429524 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-794b49f56f-6zmv9"
Apr 20 13:40:33.561834 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.561809 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-794b49f56f-6zmv9"]
Apr 20 13:40:33.563287 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:40:33.563262 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbb1277c_2495_4cb6_a7a2_3daef6fa7853.slice/crio-4df514af281511236c1b18991397496f5c9aa3440d2670b130e92b5c11dd8c25 WatchSource:0}: Error finding container 4df514af281511236c1b18991397496f5c9aa3440d2670b130e92b5c11dd8c25: Status 404 returned error can't find the container with id 4df514af281511236c1b18991397496f5c9aa3440d2670b130e92b5c11dd8c25
Apr 20 13:40:33.982665 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.982625 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="209fd00a-d607-4d6a-8b53-b5cde57de529" path="/var/lib/kubelet/pods/209fd00a-d607-4d6a-8b53-b5cde57de529/volumes"
Apr 20 13:40:33.983081 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.983065 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a12772-7503-4c3a-a33c-f69439a3b4c1" path="/var/lib/kubelet/pods/37a12772-7503-4c3a-a33c-f69439a3b4c1/volumes"
Apr 20 13:40:33.983366 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:33.983349 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e870296d-ec42-47ab-80bf-450dfe418bc3" path="/var/lib/kubelet/pods/e870296d-ec42-47ab-80bf-450dfe418bc3/volumes"
Apr 20 13:40:34.065423 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:34.065383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-794b49f56f-6zmv9" event={"ID":"dbb1277c-2495-4cb6-a7a2-3daef6fa7853","Type":"ContainerStarted","Data":"4df514af281511236c1b18991397496f5c9aa3440d2670b130e92b5c11dd8c25"}
Apr 20 13:40:34.066858 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:34.066821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg" event={"ID":"33b1e997-2189-49f6-b728-00622d7ddbc0","Type":"ContainerStarted","Data":"63cdede69257a8bdd9f42191ddcd32f708ba9762f3c7c514cfbeab20ed0d2dbc"}
Apr 20 13:40:38.085870 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.085836 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-794b49f56f-6zmv9" event={"ID":"dbb1277c-2495-4cb6-a7a2-3daef6fa7853","Type":"ContainerStarted","Data":"f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4"}
Apr 20 13:40:38.086356 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.085969 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-794b49f56f-6zmv9"
Apr 20 13:40:38.087487 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.087463 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg" event={"ID":"33b1e997-2189-49f6-b728-00622d7ddbc0","Type":"ContainerStarted","Data":"1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f"}
Apr 20 13:40:38.087609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.087508 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg"
Apr 20 13:40:38.087609 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.087520 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg" podUID="33b1e997-2189-49f6-b728-00622d7ddbc0" containerName="manager" containerID="cri-o://1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f" gracePeriod=10
Apr 20 13:40:38.107006 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.106956 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-794b49f56f-6zmv9" podStartSLOduration=1.631244387 podStartE2EDuration="5.106941253s" podCreationTimestamp="2026-04-20 13:40:33 +0000 UTC" firstStartedPulling="2026-04-20 13:40:33.564588151 +0000 UTC m=+590.160279205" lastFinishedPulling="2026-04-20 13:40:37.040285015 +0000 UTC m=+593.635976071" observedRunningTime="2026-04-20 13:40:38.10490637 +0000 UTC m=+594.700597447" watchObservedRunningTime="2026-04-20 13:40:38.106941253 +0000 UTC m=+594.702632351"
Apr 20 13:40:38.126353 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.126301 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg" podStartSLOduration=2.529305261 podStartE2EDuration="6.126285355s" podCreationTimestamp="2026-04-20 13:40:32 +0000 UTC" firstStartedPulling="2026-04-20 13:40:33.399085772 +0000 UTC m=+589.994776825" lastFinishedPulling="2026-04-20 13:40:36.996065866 +0000 UTC m=+593.591756919" observedRunningTime="2026-04-20 13:40:38.124237359 +0000 UTC m=+594.719928436" watchObservedRunningTime="2026-04-20 13:40:38.126285355 +0000 UTC m=+594.721976471"
Apr 20 13:40:38.331632 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.331609 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg"
Apr 20 13:40:38.416681 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.416591 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49lpj\" (UniqueName: \"kubernetes.io/projected/33b1e997-2189-49f6-b728-00622d7ddbc0-kube-api-access-49lpj\") pod \"33b1e997-2189-49f6-b728-00622d7ddbc0\" (UID: \"33b1e997-2189-49f6-b728-00622d7ddbc0\") "
Apr 20 13:40:38.418821 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.418792 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b1e997-2189-49f6-b728-00622d7ddbc0-kube-api-access-49lpj" (OuterVolumeSpecName: "kube-api-access-49lpj") pod "33b1e997-2189-49f6-b728-00622d7ddbc0" (UID: "33b1e997-2189-49f6-b728-00622d7ddbc0"). InnerVolumeSpecName "kube-api-access-49lpj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:40:38.517121 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:38.517088 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49lpj\" (UniqueName: \"kubernetes.io/projected/33b1e997-2189-49f6-b728-00622d7ddbc0-kube-api-access-49lpj\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\""
Apr 20 13:40:39.092943 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.092906 2573 generic.go:358] "Generic (PLEG): container finished" podID="33b1e997-2189-49f6-b728-00622d7ddbc0" containerID="1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f" exitCode=0
Apr 20 13:40:39.093363 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.092969 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg" Apr 20 13:40:39.093363 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.092994 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg" event={"ID":"33b1e997-2189-49f6-b728-00622d7ddbc0","Type":"ContainerDied","Data":"1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f"} Apr 20 13:40:39.093363 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.093036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-bjvsg" event={"ID":"33b1e997-2189-49f6-b728-00622d7ddbc0","Type":"ContainerDied","Data":"63cdede69257a8bdd9f42191ddcd32f708ba9762f3c7c514cfbeab20ed0d2dbc"} Apr 20 13:40:39.093363 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.093053 2573 scope.go:117] "RemoveContainer" containerID="1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f" Apr 20 13:40:39.106904 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.106883 2573 scope.go:117] "RemoveContainer" containerID="1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f" Apr 20 13:40:39.107248 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:40:39.107227 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f\": container with ID starting with 1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f not found: ID does not exist" containerID="1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f" Apr 20 13:40:39.107315 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.107257 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f"} err="failed to get container status \"1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f\": rpc error: 
code = NotFound desc = could not find container \"1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f\": container with ID starting with 1b62cb6ce66150539e1ea66084e2a9f53e038e5743c387952882c8c3cf9e958f not found: ID does not exist" Apr 20 13:40:39.129349 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.129309 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bjvsg"] Apr 20 13:40:39.135969 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.135935 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bjvsg"] Apr 20 13:40:39.980968 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:39.980935 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b1e997-2189-49f6-b728-00622d7ddbc0" path="/var/lib/kubelet/pods/33b1e997-2189-49f6-b728-00622d7ddbc0/volumes" Apr 20 13:40:43.910523 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:43.910494 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 13:40:43.910960 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:43.910754 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 13:40:43.913528 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:43.913509 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:40:43.913749 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:43.913732 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:40:48.047342 ip-10-0-142-144 kubenswrapper[2573]: I0420 
13:40:48.047307 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-794b49f56f-6zmv9"] Apr 20 13:40:48.047793 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:48.047572 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-794b49f56f-6zmv9" podUID="dbb1277c-2495-4cb6-a7a2-3daef6fa7853" containerName="manager" containerID="cri-o://f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4" gracePeriod=10 Apr 20 13:40:48.054798 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:48.054772 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-794b49f56f-6zmv9" Apr 20 13:40:48.288205 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:48.288182 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-794b49f56f-6zmv9" Apr 20 13:40:48.397396 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:48.397300 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntfls\" (UniqueName: \"kubernetes.io/projected/dbb1277c-2495-4cb6-a7a2-3daef6fa7853-kube-api-access-ntfls\") pod \"dbb1277c-2495-4cb6-a7a2-3daef6fa7853\" (UID: \"dbb1277c-2495-4cb6-a7a2-3daef6fa7853\") " Apr 20 13:40:48.399414 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:48.399384 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb1277c-2495-4cb6-a7a2-3daef6fa7853-kube-api-access-ntfls" (OuterVolumeSpecName: "kube-api-access-ntfls") pod "dbb1277c-2495-4cb6-a7a2-3daef6fa7853" (UID: "dbb1277c-2495-4cb6-a7a2-3daef6fa7853"). InnerVolumeSpecName "kube-api-access-ntfls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:40:48.498766 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:48.498728 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntfls\" (UniqueName: \"kubernetes.io/projected/dbb1277c-2495-4cb6-a7a2-3daef6fa7853-kube-api-access-ntfls\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:40:49.134273 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.134236 2573 generic.go:358] "Generic (PLEG): container finished" podID="dbb1277c-2495-4cb6-a7a2-3daef6fa7853" containerID="f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4" exitCode=0 Apr 20 13:40:49.134697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.134297 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-794b49f56f-6zmv9" Apr 20 13:40:49.134697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.134315 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-794b49f56f-6zmv9" event={"ID":"dbb1277c-2495-4cb6-a7a2-3daef6fa7853","Type":"ContainerDied","Data":"f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4"} Apr 20 13:40:49.134697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.134348 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-794b49f56f-6zmv9" event={"ID":"dbb1277c-2495-4cb6-a7a2-3daef6fa7853","Type":"ContainerDied","Data":"4df514af281511236c1b18991397496f5c9aa3440d2670b130e92b5c11dd8c25"} Apr 20 13:40:49.134697 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.134365 2573 scope.go:117] "RemoveContainer" containerID="f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4" Apr 20 13:40:49.144256 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.144236 2573 scope.go:117] "RemoveContainer" containerID="f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4" Apr 20 13:40:49.144509 ip-10-0-142-144 
kubenswrapper[2573]: E0420 13:40:49.144491 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4\": container with ID starting with f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4 not found: ID does not exist" containerID="f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4" Apr 20 13:40:49.144565 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.144523 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4"} err="failed to get container status \"f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4\": rpc error: code = NotFound desc = could not find container \"f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4\": container with ID starting with f9c22425e29adc11ebaad9f42d1ff4f0be43801ea5829bf2915c9d2c0e35d9a4 not found: ID does not exist" Apr 20 13:40:49.158871 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.158841 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-794b49f56f-6zmv9"] Apr 20 13:40:49.169740 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.167618 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-794b49f56f-6zmv9"] Apr 20 13:40:49.981175 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:40:49.981120 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb1277c-2495-4cb6-a7a2-3daef6fa7853" path="/var/lib/kubelet/pods/dbb1277c-2495-4cb6-a7a2-3daef6fa7853/volumes" Apr 20 13:41:18.558161 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.558109 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj"] Apr 20 13:41:18.560792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.558446 2573 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b1e997-2189-49f6-b728-00622d7ddbc0" containerName="manager" Apr 20 13:41:18.560792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.558458 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b1e997-2189-49f6-b728-00622d7ddbc0" containerName="manager" Apr 20 13:41:18.560792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.558466 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbb1277c-2495-4cb6-a7a2-3daef6fa7853" containerName="manager" Apr 20 13:41:18.560792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.558471 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb1277c-2495-4cb6-a7a2-3daef6fa7853" containerName="manager" Apr 20 13:41:18.560792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.558546 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="33b1e997-2189-49f6-b728-00622d7ddbc0" containerName="manager" Apr 20 13:41:18.560792 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.558558 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbb1277c-2495-4cb6-a7a2-3daef6fa7853" containerName="manager" Apr 20 13:41:18.561741 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.561725 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.564920 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.564894 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 13:41:18.565722 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.565705 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 13:41:18.565784 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.565705 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 13:41:18.565784 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.565707 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-kngc7\"" Apr 20 13:41:18.572996 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.572972 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj"] Apr 20 13:41:18.655103 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.655065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.655318 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.655119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.655318 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.655172 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.655318 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.655238 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.655318 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.655275 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/194b180e-2fed-4be0-ad56-9b423e81f5ea-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.655482 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.655319 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrr2c\" (UniqueName: \"kubernetes.io/projected/194b180e-2fed-4be0-ad56-9b423e81f5ea-kube-api-access-zrr2c\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.756379 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.756335 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.756379 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.756381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/194b180e-2fed-4be0-ad56-9b423e81f5ea-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.756670 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.756435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrr2c\" (UniqueName: \"kubernetes.io/projected/194b180e-2fed-4be0-ad56-9b423e81f5ea-kube-api-access-zrr2c\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.756670 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.756464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.756670 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.756509 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.756670 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.756543 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.756980 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.756956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.756980 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.756974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.757078 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.757061 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.758684 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.758659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/194b180e-2fed-4be0-ad56-9b423e81f5ea-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.758944 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.758925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/194b180e-2fed-4be0-ad56-9b423e81f5ea-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.764599 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.764576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrr2c\" (UniqueName: \"kubernetes.io/projected/194b180e-2fed-4be0-ad56-9b423e81f5ea-kube-api-access-zrr2c\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hsjrj\" (UID: \"194b180e-2fed-4be0-ad56-9b423e81f5ea\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:18.872022 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:18.871937 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:19.003203 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:19.003177 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj"] Apr 20 13:41:19.005109 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:41:19.005066 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194b180e_2fed_4be0_ad56_9b423e81f5ea.slice/crio-345a28bd4aa8cea2b9a71163800ea7f3488540011f66ecf78749ba688a563214 WatchSource:0}: Error finding container 345a28bd4aa8cea2b9a71163800ea7f3488540011f66ecf78749ba688a563214: Status 404 returned error can't find the container with id 345a28bd4aa8cea2b9a71163800ea7f3488540011f66ecf78749ba688a563214 Apr 20 13:41:19.006980 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:19.006958 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 13:41:19.251206 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:19.251111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" event={"ID":"194b180e-2fed-4be0-ad56-9b423e81f5ea","Type":"ContainerStarted","Data":"345a28bd4aa8cea2b9a71163800ea7f3488540011f66ecf78749ba688a563214"} Apr 20 13:41:20.878330 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:20.878295 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-74df864bdc-7jpsf"] Apr 20 13:41:20.882240 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:20.882218 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:20.885648 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:20.885623 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 13:41:20.885648 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:20.885639 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 13:41:20.885841 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:20.885695 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hn6q8\"" Apr 20 13:41:20.891263 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:20.891241 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-74df864bdc-7jpsf"] Apr 20 13:41:20.975853 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:20.975753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r744r\" (UniqueName: \"kubernetes.io/projected/82fe8d2e-0333-4462-aed7-6444cc682cff-kube-api-access-r744r\") pod \"maas-api-74df864bdc-7jpsf\" (UID: \"82fe8d2e-0333-4462-aed7-6444cc682cff\") " pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:20.976026 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:20.975880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/82fe8d2e-0333-4462-aed7-6444cc682cff-maas-api-tls\") pod \"maas-api-74df864bdc-7jpsf\" (UID: \"82fe8d2e-0333-4462-aed7-6444cc682cff\") " pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:21.077367 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:21.077323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/82fe8d2e-0333-4462-aed7-6444cc682cff-maas-api-tls\") pod 
\"maas-api-74df864bdc-7jpsf\" (UID: \"82fe8d2e-0333-4462-aed7-6444cc682cff\") " pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:21.077597 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:21.077574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r744r\" (UniqueName: \"kubernetes.io/projected/82fe8d2e-0333-4462-aed7-6444cc682cff-kube-api-access-r744r\") pod \"maas-api-74df864bdc-7jpsf\" (UID: \"82fe8d2e-0333-4462-aed7-6444cc682cff\") " pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:21.080249 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:21.080224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/82fe8d2e-0333-4462-aed7-6444cc682cff-maas-api-tls\") pod \"maas-api-74df864bdc-7jpsf\" (UID: \"82fe8d2e-0333-4462-aed7-6444cc682cff\") " pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:21.085977 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:21.085955 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r744r\" (UniqueName: \"kubernetes.io/projected/82fe8d2e-0333-4462-aed7-6444cc682cff-kube-api-access-r744r\") pod \"maas-api-74df864bdc-7jpsf\" (UID: \"82fe8d2e-0333-4462-aed7-6444cc682cff\") " pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:21.198299 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:21.198265 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:21.353735 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:21.353708 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-74df864bdc-7jpsf"] Apr 20 13:41:21.355923 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:41:21.355888 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82fe8d2e_0333_4462_aed7_6444cc682cff.slice/crio-544242e696e1a022fdf9827b23174ce1fd5885d4aa1ac7f147e5a5a8cd4a7b05 WatchSource:0}: Error finding container 544242e696e1a022fdf9827b23174ce1fd5885d4aa1ac7f147e5a5a8cd4a7b05: Status 404 returned error can't find the container with id 544242e696e1a022fdf9827b23174ce1fd5885d4aa1ac7f147e5a5a8cd4a7b05 Apr 20 13:41:22.267807 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:22.267771 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74df864bdc-7jpsf" event={"ID":"82fe8d2e-0333-4462-aed7-6444cc682cff","Type":"ContainerStarted","Data":"544242e696e1a022fdf9827b23174ce1fd5885d4aa1ac7f147e5a5a8cd4a7b05"} Apr 20 13:41:24.279763 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:24.279720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" event={"ID":"194b180e-2fed-4be0-ad56-9b423e81f5ea","Type":"ContainerStarted","Data":"dbdeddd62d63fbed9427701610c122625b5593039cf74e7eea691b01bc481b67"} Apr 20 13:41:25.284686 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:25.284629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74df864bdc-7jpsf" event={"ID":"82fe8d2e-0333-4462-aed7-6444cc682cff","Type":"ContainerStarted","Data":"cad28c5dba484cdb94254e6337cb375aafffb26432577f0003bbf8ef363f51cc"} Apr 20 13:41:25.285166 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:25.284710 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:25.304840 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:25.304775 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-74df864bdc-7jpsf" podStartSLOduration=1.522495115 podStartE2EDuration="5.30475904s" podCreationTimestamp="2026-04-20 13:41:20 +0000 UTC" firstStartedPulling="2026-04-20 13:41:21.3575385 +0000 UTC m=+637.953229553" lastFinishedPulling="2026-04-20 13:41:25.139802425 +0000 UTC m=+641.735493478" observedRunningTime="2026-04-20 13:41:25.301618029 +0000 UTC m=+641.897309120" watchObservedRunningTime="2026-04-20 13:41:25.30475904 +0000 UTC m=+641.900450116" Apr 20 13:41:30.306393 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:30.306349 2573 generic.go:358] "Generic (PLEG): container finished" podID="194b180e-2fed-4be0-ad56-9b423e81f5ea" containerID="dbdeddd62d63fbed9427701610c122625b5593039cf74e7eea691b01bc481b67" exitCode=0 Apr 20 13:41:30.306798 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:30.306429 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" event={"ID":"194b180e-2fed-4be0-ad56-9b423e81f5ea","Type":"ContainerDied","Data":"dbdeddd62d63fbed9427701610c122625b5593039cf74e7eea691b01bc481b67"} Apr 20 13:41:31.296270 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:31.296237 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-74df864bdc-7jpsf" Apr 20 13:41:32.316358 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:32.316319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" event={"ID":"194b180e-2fed-4be0-ad56-9b423e81f5ea","Type":"ContainerStarted","Data":"654963c1216221451624f8535477a90f88839bf18c0ac05a1390fd68071f9071"} Apr 20 13:41:32.316803 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:32.316528 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:41:32.337812 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:32.337758 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" podStartSLOduration=1.90794415 podStartE2EDuration="14.337745341s" podCreationTimestamp="2026-04-20 13:41:18 +0000 UTC" firstStartedPulling="2026-04-20 13:41:19.007080387 +0000 UTC m=+635.602771442" lastFinishedPulling="2026-04-20 13:41:31.436881578 +0000 UTC m=+648.032572633" observedRunningTime="2026-04-20 13:41:32.334887767 +0000 UTC m=+648.930578844" watchObservedRunningTime="2026-04-20 13:41:32.337745341 +0000 UTC m=+648.933436416" Apr 20 13:41:43.337124 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:41:43.337092 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hsjrj" Apr 20 13:42:13.956300 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.956263 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff"] Apr 20 13:42:13.959755 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.959734 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:13.962069 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.962048 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 13:42:13.969088 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.969064 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff"] Apr 20 13:42:13.984869 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.984840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkrn\" (UniqueName: \"kubernetes.io/projected/3f18fcad-736b-448e-8a01-e59a0fd9fb22-kube-api-access-ztkrn\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:13.984869 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.984873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:13.985087 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.984908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f18fcad-736b-448e-8a01-e59a0fd9fb22-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:13.985087 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.984969 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:13.985087 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.985016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:13.985087 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:13.985041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.085461 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.085427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f18fcad-736b-448e-8a01-e59a0fd9fb22-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.085652 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.085475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: 
\"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.085652 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.085524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.085652 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.085551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.085652 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.085589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkrn\" (UniqueName: \"kubernetes.io/projected/3f18fcad-736b-448e-8a01-e59a0fd9fb22-kube-api-access-ztkrn\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.085652 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.085610 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.085967 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.085946 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.086022 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.086003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.086022 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.086011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.088300 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.088273 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f18fcad-736b-448e-8a01-e59a0fd9fb22-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.088505 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.088486 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f18fcad-736b-448e-8a01-e59a0fd9fb22-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.094034 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.094004 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkrn\" (UniqueName: \"kubernetes.io/projected/3f18fcad-736b-448e-8a01-e59a0fd9fb22-kube-api-access-ztkrn\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7twff\" (UID: \"3f18fcad-736b-448e-8a01-e59a0fd9fb22\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.270548 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.270449 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:14.402186 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.402134 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff"] Apr 20 13:42:14.404087 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:42:14.404061 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f18fcad_736b_448e_8a01_e59a0fd9fb22.slice/crio-20736f58aba2dff3e17b8b9f562b28116199803f32282707672998ec3a509bf7 WatchSource:0}: Error finding container 20736f58aba2dff3e17b8b9f562b28116199803f32282707672998ec3a509bf7: Status 404 returned error can't find the container with id 20736f58aba2dff3e17b8b9f562b28116199803f32282707672998ec3a509bf7 Apr 20 13:42:14.478992 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.478956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" event={"ID":"3f18fcad-736b-448e-8a01-e59a0fd9fb22","Type":"ContainerStarted","Data":"7a0358d0a1cc53f0b5cccd2c637f30b0e5affcb618c6fdf29ce0908e7d7425bc"} Apr 20 13:42:14.479186 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:14.478999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" event={"ID":"3f18fcad-736b-448e-8a01-e59a0fd9fb22","Type":"ContainerStarted","Data":"20736f58aba2dff3e17b8b9f562b28116199803f32282707672998ec3a509bf7"} Apr 20 13:42:20.502838 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:20.502801 2573 generic.go:358] "Generic (PLEG): container finished" podID="3f18fcad-736b-448e-8a01-e59a0fd9fb22" containerID="7a0358d0a1cc53f0b5cccd2c637f30b0e5affcb618c6fdf29ce0908e7d7425bc" exitCode=0 Apr 20 13:42:20.503262 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:20.502872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" event={"ID":"3f18fcad-736b-448e-8a01-e59a0fd9fb22","Type":"ContainerDied","Data":"7a0358d0a1cc53f0b5cccd2c637f30b0e5affcb618c6fdf29ce0908e7d7425bc"} Apr 20 13:42:21.508874 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:21.508840 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" event={"ID":"3f18fcad-736b-448e-8a01-e59a0fd9fb22","Type":"ContainerStarted","Data":"62228eeac6e3fa1d680c79615548a5fca86e3cb4476bd87a63a669ce50d4b4e9"} Apr 20 13:42:21.509352 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:21.509061 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:21.527200 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:21.527130 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" podStartSLOduration=8.27656329 podStartE2EDuration="8.527116201s" podCreationTimestamp="2026-04-20 13:42:13 +0000 UTC" firstStartedPulling="2026-04-20 13:42:20.503603269 +0000 UTC m=+697.099294323" lastFinishedPulling="2026-04-20 13:42:20.754156178 +0000 UTC m=+697.349847234" observedRunningTime="2026-04-20 13:42:21.52643095 +0000 UTC m=+698.122122028" 
watchObservedRunningTime="2026-04-20 13:42:21.527116201 +0000 UTC m=+698.122807276" Apr 20 13:42:32.524841 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:32.524765 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7twff" Apr 20 13:42:46.062011 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.061975 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-76f7db8cbd-fbrgf"] Apr 20 13:42:46.065510 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.065493 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" Apr 20 13:42:46.074339 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.074313 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-76f7db8cbd-fbrgf"] Apr 20 13:42:46.146438 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.146404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/52a32186-8c5b-49a7-b14c-232ae9e04531-tls-cert\") pod \"authorino-76f7db8cbd-fbrgf\" (UID: \"52a32186-8c5b-49a7-b14c-232ae9e04531\") " pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" Apr 20 13:42:46.146614 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.146455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwpj\" (UniqueName: \"kubernetes.io/projected/52a32186-8c5b-49a7-b14c-232ae9e04531-kube-api-access-6hwpj\") pod \"authorino-76f7db8cbd-fbrgf\" (UID: \"52a32186-8c5b-49a7-b14c-232ae9e04531\") " pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" Apr 20 13:42:46.247155 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.247117 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/52a32186-8c5b-49a7-b14c-232ae9e04531-tls-cert\") pod 
\"authorino-76f7db8cbd-fbrgf\" (UID: \"52a32186-8c5b-49a7-b14c-232ae9e04531\") " pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" Apr 20 13:42:46.247340 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.247193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwpj\" (UniqueName: \"kubernetes.io/projected/52a32186-8c5b-49a7-b14c-232ae9e04531-kube-api-access-6hwpj\") pod \"authorino-76f7db8cbd-fbrgf\" (UID: \"52a32186-8c5b-49a7-b14c-232ae9e04531\") " pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" Apr 20 13:42:46.249666 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.249645 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/52a32186-8c5b-49a7-b14c-232ae9e04531-tls-cert\") pod \"authorino-76f7db8cbd-fbrgf\" (UID: \"52a32186-8c5b-49a7-b14c-232ae9e04531\") " pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" Apr 20 13:42:46.255878 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.255855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwpj\" (UniqueName: \"kubernetes.io/projected/52a32186-8c5b-49a7-b14c-232ae9e04531-kube-api-access-6hwpj\") pod \"authorino-76f7db8cbd-fbrgf\" (UID: \"52a32186-8c5b-49a7-b14c-232ae9e04531\") " pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" Apr 20 13:42:46.375653 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.375570 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" Apr 20 13:42:46.508992 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.508966 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-76f7db8cbd-fbrgf"] Apr 20 13:42:46.510747 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:42:46.510714 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a32186_8c5b_49a7_b14c_232ae9e04531.slice/crio-b99f4dfed7a45b50e978fecbe16beef85b6a8bff12f22f0d89b787728a5dc64c WatchSource:0}: Error finding container b99f4dfed7a45b50e978fecbe16beef85b6a8bff12f22f0d89b787728a5dc64c: Status 404 returned error can't find the container with id b99f4dfed7a45b50e978fecbe16beef85b6a8bff12f22f0d89b787728a5dc64c Apr 20 13:42:46.606332 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:46.606296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" event={"ID":"52a32186-8c5b-49a7-b14c-232ae9e04531","Type":"ContainerStarted","Data":"b99f4dfed7a45b50e978fecbe16beef85b6a8bff12f22f0d89b787728a5dc64c"} Apr 20 13:42:47.631800 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:47.631755 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" event={"ID":"52a32186-8c5b-49a7-b14c-232ae9e04531","Type":"ContainerStarted","Data":"a1f699a854a9ebbbd3a249efab4df76c717d2dc22df2757afbce6a47478d40cd"} Apr 20 13:42:47.664505 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:47.664435 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-76f7db8cbd-fbrgf" podStartSLOduration=1.225037656 podStartE2EDuration="1.664412601s" podCreationTimestamp="2026-04-20 13:42:46 +0000 UTC" firstStartedPulling="2026-04-20 13:42:46.512013794 +0000 UTC m=+723.107704848" lastFinishedPulling="2026-04-20 13:42:46.951388739 +0000 UTC m=+723.547079793" 
observedRunningTime="2026-04-20 13:42:47.659340302 +0000 UTC m=+724.255031391" watchObservedRunningTime="2026-04-20 13:42:47.664412601 +0000 UTC m=+724.260103679" Apr 20 13:42:47.696758 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:47.696721 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-876f8b889-wnm5f"] Apr 20 13:42:47.696983 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:47.696959 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-876f8b889-wnm5f" podUID="a149d8d3-6df7-4813-995b-6647fced3c56" containerName="authorino" containerID="cri-o://3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd" gracePeriod=30 Apr 20 13:42:47.937777 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:47.937743 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-876f8b889-wnm5f" Apr 20 13:42:48.061250 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.061217 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdr69\" (UniqueName: \"kubernetes.io/projected/a149d8d3-6df7-4813-995b-6647fced3c56-kube-api-access-qdr69\") pod \"a149d8d3-6df7-4813-995b-6647fced3c56\" (UID: \"a149d8d3-6df7-4813-995b-6647fced3c56\") " Apr 20 13:42:48.061414 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.061323 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a149d8d3-6df7-4813-995b-6647fced3c56-tls-cert\") pod \"a149d8d3-6df7-4813-995b-6647fced3c56\" (UID: \"a149d8d3-6df7-4813-995b-6647fced3c56\") " Apr 20 13:42:48.063343 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.063315 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a149d8d3-6df7-4813-995b-6647fced3c56-kube-api-access-qdr69" (OuterVolumeSpecName: "kube-api-access-qdr69") pod 
"a149d8d3-6df7-4813-995b-6647fced3c56" (UID: "a149d8d3-6df7-4813-995b-6647fced3c56"). InnerVolumeSpecName "kube-api-access-qdr69". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:42:48.071386 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.071365 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a149d8d3-6df7-4813-995b-6647fced3c56-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "a149d8d3-6df7-4813-995b-6647fced3c56" (UID: "a149d8d3-6df7-4813-995b-6647fced3c56"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:42:48.162233 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.162200 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdr69\" (UniqueName: \"kubernetes.io/projected/a149d8d3-6df7-4813-995b-6647fced3c56-kube-api-access-qdr69\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:42:48.162233 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.162228 2573 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a149d8d3-6df7-4813-995b-6647fced3c56-tls-cert\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:42:48.636714 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.636683 2573 generic.go:358] "Generic (PLEG): container finished" podID="a149d8d3-6df7-4813-995b-6647fced3c56" containerID="3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd" exitCode=0 Apr 20 13:42:48.637212 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.636728 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-876f8b889-wnm5f" Apr 20 13:42:48.637212 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.636771 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-876f8b889-wnm5f" event={"ID":"a149d8d3-6df7-4813-995b-6647fced3c56","Type":"ContainerDied","Data":"3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd"} Apr 20 13:42:48.637212 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.636808 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-876f8b889-wnm5f" event={"ID":"a149d8d3-6df7-4813-995b-6647fced3c56","Type":"ContainerDied","Data":"71094a35fd78d179df7d19afb555d370ee90525fbbb55bfc4ad1f767fc3b3f55"} Apr 20 13:42:48.637212 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.636825 2573 scope.go:117] "RemoveContainer" containerID="3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd" Apr 20 13:42:48.646322 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.646304 2573 scope.go:117] "RemoveContainer" containerID="3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd" Apr 20 13:42:48.646552 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:42:48.646537 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd\": container with ID starting with 3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd not found: ID does not exist" containerID="3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd" Apr 20 13:42:48.646608 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.646558 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd"} err="failed to get container status \"3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd\": rpc error: code = 
NotFound desc = could not find container \"3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd\": container with ID starting with 3c7442db031bfe31495384948d004804519622187359228e83424d90c38629bd not found: ID does not exist" Apr 20 13:42:48.657488 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.657462 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-876f8b889-wnm5f"] Apr 20 13:42:48.661278 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:48.661256 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-876f8b889-wnm5f"] Apr 20 13:42:49.981581 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:42:49.981550 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a149d8d3-6df7-4813-995b-6647fced3c56" path="/var/lib/kubelet/pods/a149d8d3-6df7-4813-995b-6647fced3c56/volumes" Apr 20 13:44:13.601520 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.601438 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-67b98965d5-msrm8"] Apr 20 13:44:13.602026 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.601812 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a149d8d3-6df7-4813-995b-6647fced3c56" containerName="authorino" Apr 20 13:44:13.602026 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.601824 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a149d8d3-6df7-4813-995b-6647fced3c56" containerName="authorino" Apr 20 13:44:13.602026 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.601887 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a149d8d3-6df7-4813-995b-6647fced3c56" containerName="authorino" Apr 20 13:44:13.604831 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.604812 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67b98965d5-msrm8" Apr 20 13:44:13.607066 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.607048 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-d7t72\"" Apr 20 13:44:13.615659 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.615636 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67b98965d5-msrm8"] Apr 20 13:44:13.706438 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.706403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5drk\" (UniqueName: \"kubernetes.io/projected/26e2183b-e192-4c7c-829d-503b4c066515-kube-api-access-x5drk\") pod \"maas-controller-67b98965d5-msrm8\" (UID: \"26e2183b-e192-4c7c-829d-503b4c066515\") " pod="opendatahub/maas-controller-67b98965d5-msrm8" Apr 20 13:44:13.807019 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.806972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5drk\" (UniqueName: \"kubernetes.io/projected/26e2183b-e192-4c7c-829d-503b4c066515-kube-api-access-x5drk\") pod \"maas-controller-67b98965d5-msrm8\" (UID: \"26e2183b-e192-4c7c-829d-503b4c066515\") " pod="opendatahub/maas-controller-67b98965d5-msrm8" Apr 20 13:44:13.815930 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.815899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5drk\" (UniqueName: \"kubernetes.io/projected/26e2183b-e192-4c7c-829d-503b4c066515-kube-api-access-x5drk\") pod \"maas-controller-67b98965d5-msrm8\" (UID: \"26e2183b-e192-4c7c-829d-503b4c066515\") " pod="opendatahub/maas-controller-67b98965d5-msrm8" Apr 20 13:44:13.914895 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:13.914815 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67b98965d5-msrm8" Apr 20 13:44:14.046475 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:14.046443 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67b98965d5-msrm8"] Apr 20 13:44:14.048711 ip-10-0-142-144 kubenswrapper[2573]: W0420 13:44:14.048680 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e2183b_e192_4c7c_829d_503b4c066515.slice/crio-3b4d01533694faa5f9c3dfc80db71802eea89710aaea672e0de27096dcad8164 WatchSource:0}: Error finding container 3b4d01533694faa5f9c3dfc80db71802eea89710aaea672e0de27096dcad8164: Status 404 returned error can't find the container with id 3b4d01533694faa5f9c3dfc80db71802eea89710aaea672e0de27096dcad8164 Apr 20 13:44:14.992006 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:14.991971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67b98965d5-msrm8" event={"ID":"26e2183b-e192-4c7c-829d-503b4c066515","Type":"ContainerStarted","Data":"970bcf99e501b8093966a712cd1f8d35d2a03f32f66bb99f6a35164c9670db69"} Apr 20 13:44:14.992006 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:14.992007 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67b98965d5-msrm8" event={"ID":"26e2183b-e192-4c7c-829d-503b4c066515","Type":"ContainerStarted","Data":"3b4d01533694faa5f9c3dfc80db71802eea89710aaea672e0de27096dcad8164"} Apr 20 13:44:14.992487 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:14.992119 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-67b98965d5-msrm8" Apr 20 13:44:15.012828 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:15.012779 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-67b98965d5-msrm8" podStartSLOduration=1.594200339 podStartE2EDuration="2.012762015s" 
podCreationTimestamp="2026-04-20 13:44:13 +0000 UTC" firstStartedPulling="2026-04-20 13:44:14.049930039 +0000 UTC m=+810.645621093" lastFinishedPulling="2026-04-20 13:44:14.468491715 +0000 UTC m=+811.064182769" observedRunningTime="2026-04-20 13:44:15.009878734 +0000 UTC m=+811.605569809" watchObservedRunningTime="2026-04-20 13:44:15.012762015 +0000 UTC m=+811.608453091" Apr 20 13:44:26.001530 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:44:26.001496 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-67b98965d5-msrm8" Apr 20 13:45:43.943292 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:45:43.943217 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 13:45:43.943766 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:45:43.943583 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 13:45:43.946041 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:45:43.946016 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:45:43.946295 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:45:43.946278 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:50:43.973072 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:50:43.973044 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 13:50:43.975735 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:50:43.975715 2573 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 13:50:43.975907 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:50:43.975889 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:50:43.978745 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:50:43.978725 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:55:44.008730 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:55:44.008697 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 13:55:44.011852 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:55:44.011829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:55:44.013166 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:55:44.013128 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 13:55:44.016087 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:55:44.016065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 13:59:01.846761 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:01.846721 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"] Apr 20 13:59:01.849346 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:01.846961 2573 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" podUID="0dd8f1c0-f2de-4064-be40-0f7f772b58e4" containerName="manager" containerID="cri-o://8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab" gracePeriod=10 Apr 20 13:59:02.195770 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.195747 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" Apr 20 13:59:02.334588 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.334557 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz7c7\" (UniqueName: \"kubernetes.io/projected/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-kube-api-access-hz7c7\") pod \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\" (UID: \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\") " Apr 20 13:59:02.334788 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.334609 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-extensions-socket-volume\") pod \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\" (UID: \"0dd8f1c0-f2de-4064-be40-0f7f772b58e4\") " Apr 20 13:59:02.335024 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.334998 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "0dd8f1c0-f2de-4064-be40-0f7f772b58e4" (UID: "0dd8f1c0-f2de-4064-be40-0f7f772b58e4"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:59:02.337106 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.337081 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-kube-api-access-hz7c7" (OuterVolumeSpecName: "kube-api-access-hz7c7") pod "0dd8f1c0-f2de-4064-be40-0f7f772b58e4" (UID: "0dd8f1c0-f2de-4064-be40-0f7f772b58e4"). InnerVolumeSpecName "kube-api-access-hz7c7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:59:02.435438 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.435349 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hz7c7\" (UniqueName: \"kubernetes.io/projected/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-kube-api-access-hz7c7\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:59:02.435438 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.435377 2573 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0dd8f1c0-f2de-4064-be40-0f7f772b58e4-extensions-socket-volume\") on node \"ip-10-0-142-144.ec2.internal\" DevicePath \"\"" Apr 20 13:59:02.439397 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.439368 2573 generic.go:358] "Generic (PLEG): container finished" podID="0dd8f1c0-f2de-4064-be40-0f7f772b58e4" containerID="8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab" exitCode=0 Apr 20 13:59:02.439515 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.439442 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" Apr 20 13:59:02.439515 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.439456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" event={"ID":"0dd8f1c0-f2de-4064-be40-0f7f772b58e4","Type":"ContainerDied","Data":"8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab"} Apr 20 13:59:02.439515 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.439494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn" event={"ID":"0dd8f1c0-f2de-4064-be40-0f7f772b58e4","Type":"ContainerDied","Data":"b84ef8bc51eafcb7f5fb0b30627d1e9af5d756465fb59ea407f3ec4a2999e92c"} Apr 20 13:59:02.439649 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.439515 2573 scope.go:117] "RemoveContainer" containerID="8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab" Apr 20 13:59:02.449197 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.449174 2573 scope.go:117] "RemoveContainer" containerID="8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab" Apr 20 13:59:02.449457 ip-10-0-142-144 kubenswrapper[2573]: E0420 13:59:02.449439 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab\": container with ID starting with 8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab not found: ID does not exist" containerID="8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab" Apr 20 13:59:02.449508 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.449467 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab"} err="failed to get container status 
\"8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab\": rpc error: code = NotFound desc = could not find container \"8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab\": container with ID starting with 8de79ac9abc9e584960975da2f38169651177abff14269476eeba5b5c487c1ab not found: ID does not exist" Apr 20 13:59:02.463059 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.463029 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"] Apr 20 13:59:02.465356 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:02.465310 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p2gpn"] Apr 20 13:59:03.980887 ip-10-0-142-144 kubenswrapper[2573]: I0420 13:59:03.980855 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd8f1c0-f2de-4064-be40-0f7f772b58e4" path="/var/lib/kubelet/pods/0dd8f1c0-f2de-4064-be40-0f7f772b58e4/volumes" Apr 20 14:00:07.946470 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:07.946431 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv"] Apr 20 14:00:07.946882 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:07.946775 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dd8f1c0-f2de-4064-be40-0f7f772b58e4" containerName="manager" Apr 20 14:00:07.946882 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:07.946788 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd8f1c0-f2de-4064-be40-0f7f772b58e4" containerName="manager" Apr 20 14:00:07.946882 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:07.946856 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dd8f1c0-f2de-4064-be40-0f7f772b58e4" containerName="manager" Apr 20 14:00:07.950267 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:07.950252 2573 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:07.952839 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:07.952822 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ckqz8\"" Apr 20 14:00:07.962687 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:07.962664 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv"] Apr 20 14:00:08.098706 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.098668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8d8629a-88a8-4fdf-91a3-1a0cf51a4119-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-v5vlv\" (UID: \"b8d8629a-88a8-4fdf-91a3-1a0cf51a4119\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:08.098884 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.098744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm7qw\" (UniqueName: \"kubernetes.io/projected/b8d8629a-88a8-4fdf-91a3-1a0cf51a4119-kube-api-access-sm7qw\") pod \"kuadrant-operator-controller-manager-55c7f4c975-v5vlv\" (UID: \"b8d8629a-88a8-4fdf-91a3-1a0cf51a4119\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:08.199473 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.199399 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8d8629a-88a8-4fdf-91a3-1a0cf51a4119-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-v5vlv\" (UID: \"b8d8629a-88a8-4fdf-91a3-1a0cf51a4119\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:08.199596 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.199488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sm7qw\" (UniqueName: \"kubernetes.io/projected/b8d8629a-88a8-4fdf-91a3-1a0cf51a4119-kube-api-access-sm7qw\") pod \"kuadrant-operator-controller-manager-55c7f4c975-v5vlv\" (UID: \"b8d8629a-88a8-4fdf-91a3-1a0cf51a4119\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:08.199788 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.199769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8d8629a-88a8-4fdf-91a3-1a0cf51a4119-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-v5vlv\" (UID: \"b8d8629a-88a8-4fdf-91a3-1a0cf51a4119\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:08.210864 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.210835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm7qw\" (UniqueName: \"kubernetes.io/projected/b8d8629a-88a8-4fdf-91a3-1a0cf51a4119-kube-api-access-sm7qw\") pod \"kuadrant-operator-controller-manager-55c7f4c975-v5vlv\" (UID: \"b8d8629a-88a8-4fdf-91a3-1a0cf51a4119\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:08.260454 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.260413 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:08.392598 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.392570 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv"] Apr 20 14:00:08.394119 ip-10-0-142-144 kubenswrapper[2573]: W0420 14:00:08.394091 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8d8629a_88a8_4fdf_91a3_1a0cf51a4119.slice/crio-24bbd975a1ff44f61d9a576e3a79a8e2a3cca7d72609910284eb0ae73f4554fa WatchSource:0}: Error finding container 24bbd975a1ff44f61d9a576e3a79a8e2a3cca7d72609910284eb0ae73f4554fa: Status 404 returned error can't find the container with id 24bbd975a1ff44f61d9a576e3a79a8e2a3cca7d72609910284eb0ae73f4554fa Apr 20 14:00:08.396282 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.396264 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:00:08.694113 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.694071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" event={"ID":"b8d8629a-88a8-4fdf-91a3-1a0cf51a4119","Type":"ContainerStarted","Data":"7581fca37d9ac9d578156c117d56737b69e9dc540944ec6e57a2551a931ccb90"} Apr 20 14:00:08.694113 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.694117 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" event={"ID":"b8d8629a-88a8-4fdf-91a3-1a0cf51a4119","Type":"ContainerStarted","Data":"24bbd975a1ff44f61d9a576e3a79a8e2a3cca7d72609910284eb0ae73f4554fa"} Apr 20 14:00:08.694355 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.694208 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:08.721163 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:08.721022 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" podStartSLOduration=1.721003637 podStartE2EDuration="1.721003637s" podCreationTimestamp="2026-04-20 14:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:00:08.720187403 +0000 UTC m=+1765.315878480" watchObservedRunningTime="2026-04-20 14:00:08.721003637 +0000 UTC m=+1765.316694717" Apr 20 14:00:19.699829 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:19.699796 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-v5vlv" Apr 20 14:00:44.039932 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:44.039906 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 14:00:44.042909 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:44.042891 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 14:00:44.044935 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:44.044915 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log" Apr 20 14:00:44.047748 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:00:44.047731 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log" Apr 20 14:01:34.372975 
ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:34.372935 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-76f7db8cbd-fbrgf_52a32186-8c5b-49a7-b14c-232ae9e04531/authorino/0.log" Apr 20 14:01:38.430419 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:38.430387 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-74df864bdc-7jpsf_82fe8d2e-0333-4462-aed7-6444cc682cff/maas-api/0.log" Apr 20 14:01:38.549055 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:38.549018 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-67b98965d5-msrm8_26e2183b-e192-4c7c-829d-503b4c066515/manager/0.log" Apr 20 14:01:39.049809 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:39.049783 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8cd4c57cb-zqkmf_10cde495-20e7-4979-9d71-c3cf9f4b00f3/manager/0.log" Apr 20 14:01:39.934382 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:39.934357 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6_657232aa-0963-4c20-8a39-ec48e4daf9c3/util/0.log" Apr 20 14:01:39.940169 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:39.940136 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6_657232aa-0963-4c20-8a39-ec48e4daf9c3/pull/0.log" Apr 20 14:01:39.945631 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:39.945604 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6_657232aa-0963-4c20-8a39-ec48e4daf9c3/extract/0.log" Apr 20 14:01:40.056305 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.056278 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l_adf98efc-4526-4e92-9a43-65e8dcb62815/util/0.log" Apr 20 14:01:40.066883 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.066856 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l_adf98efc-4526-4e92-9a43-65e8dcb62815/pull/0.log" Apr 20 14:01:40.074562 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.074543 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l_adf98efc-4526-4e92-9a43-65e8dcb62815/extract/0.log" Apr 20 14:01:40.187318 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.187233 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg_78ed81a2-6689-4285-ba75-0dc159bbc2de/util/0.log" Apr 20 14:01:40.193157 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.193122 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg_78ed81a2-6689-4285-ba75-0dc159bbc2de/pull/0.log" Apr 20 14:01:40.198954 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.198936 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg_78ed81a2-6689-4285-ba75-0dc159bbc2de/extract/0.log" Apr 20 14:01:40.316852 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.316827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8_1ceaf24b-e567-4745-b221-00135c2089ba/util/0.log" Apr 20 14:01:40.323034 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.323017 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8_1ceaf24b-e567-4745-b221-00135c2089ba/pull/0.log" Apr 20 14:01:40.329060 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.329026 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8_1ceaf24b-e567-4745-b221-00135c2089ba/extract/0.log" Apr 20 14:01:40.442072 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.441994 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-76f7db8cbd-fbrgf_52a32186-8c5b-49a7-b14c-232ae9e04531/authorino/0.log" Apr 20 14:01:40.674783 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.674756 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-blmp5_90a9993b-28b8-4230-af21-0623d9670090/manager/0.log" Apr 20 14:01:40.902946 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:40.902921 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-k5wnq_ac6b711e-d59d-469a-b1e5-276940ec1a43/registry-server/0.log" Apr 20 14:01:41.022403 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:41.022371 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-v5vlv_b8d8629a-88a8-4fdf-91a3-1a0cf51a4119/manager/0.log" Apr 20 14:01:41.250122 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:41.250047 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-fglm2_f731df91-8b8d-4fa5-9e6e-8486c58fbcaa/manager/0.log" Apr 20 14:01:41.596499 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:41.596467 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn_f6293098-abf3-4a59-b6d9-be0f73a7ef51/istio-proxy/0.log" Apr 20 14:01:42.162013 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:42.161985 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-67666b9c78-x7c4n_a70ca6b1-f55d-4081-b09f-dd5454b489d3/router/0.log" Apr 20 14:01:42.499335 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:42.499257 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-7twff_3f18fcad-736b-448e-8a01-e59a0fd9fb22/storage-initializer/0.log" Apr 20 14:01:42.506889 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:42.506867 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-7twff_3f18fcad-736b-448e-8a01-e59a0fd9fb22/main/0.log" Apr 20 14:01:42.736907 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:42.736880 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-hsjrj_194b180e-2fed-4be0-ad56-9b423e81f5ea/storage-initializer/0.log" Apr 20 14:01:42.743101 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:42.743081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-hsjrj_194b180e-2fed-4be0-ad56-9b423e81f5ea/main/0.log" Apr 20 14:01:50.415546 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:50.415519 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5w2mr_3b76b883-7171-4efb-b2ac-fc558e9fdf79/global-pull-secret-syncer/0.log" Apr 20 14:01:50.572409 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:50.572378 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tm5fj_a7673b95-6e38-4e6e-84a1-c083cd4e6356/konnectivity-agent/0.log" Apr 20 14:01:50.674488 ip-10-0-142-144 kubenswrapper[2573]: I0420 
14:01:50.674405 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-144.ec2.internal_1823e330d15f0fe92f9823b1c0261d30/haproxy/0.log" Apr 20 14:01:54.449886 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.449804 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6_657232aa-0963-4c20-8a39-ec48e4daf9c3/extract/0.log" Apr 20 14:01:54.473611 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.473584 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6_657232aa-0963-4c20-8a39-ec48e4daf9c3/util/0.log" Apr 20 14:01:54.499402 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.499371 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759bjst6_657232aa-0963-4c20-8a39-ec48e4daf9c3/pull/0.log" Apr 20 14:01:54.528048 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.528020 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l_adf98efc-4526-4e92-9a43-65e8dcb62815/extract/0.log" Apr 20 14:01:54.557287 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.557262 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l_adf98efc-4526-4e92-9a43-65e8dcb62815/util/0.log" Apr 20 14:01:54.579981 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.579955 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0sbq4l_adf98efc-4526-4e92-9a43-65e8dcb62815/pull/0.log" Apr 20 14:01:54.608547 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.608520 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg_78ed81a2-6689-4285-ba75-0dc159bbc2de/extract/0.log" Apr 20 14:01:54.643430 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.643404 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg_78ed81a2-6689-4285-ba75-0dc159bbc2de/util/0.log" Apr 20 14:01:54.694754 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.694730 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73qkwpg_78ed81a2-6689-4285-ba75-0dc159bbc2de/pull/0.log" Apr 20 14:01:54.766650 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.766578 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8_1ceaf24b-e567-4745-b221-00135c2089ba/extract/0.log" Apr 20 14:01:54.808616 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.808587 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8_1ceaf24b-e567-4745-b221-00135c2089ba/util/0.log" Apr 20 14:01:54.833468 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.833444 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1q67s8_1ceaf24b-e567-4745-b221-00135c2089ba/pull/0.log" Apr 20 14:01:54.987250 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:54.987217 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-76f7db8cbd-fbrgf_52a32186-8c5b-49a7-b14c-232ae9e04531/authorino/0.log" Apr 20 14:01:55.036457 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:55.036434 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-blmp5_90a9993b-28b8-4230-af21-0623d9670090/manager/0.log"
Apr 20 14:01:55.101936 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:55.101907 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-k5wnq_ac6b711e-d59d-469a-b1e5-276940ec1a43/registry-server/0.log"
Apr 20 14:01:55.128746 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:55.128715 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-v5vlv_b8d8629a-88a8-4fdf-91a3-1a0cf51a4119/manager/0.log"
Apr 20 14:01:55.202157 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:55.202123 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-fglm2_f731df91-8b8d-4fa5-9e6e-8486c58fbcaa/manager/0.log"
Apr 20 14:01:56.895827 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:56.895800 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-vtms5_27fd6d03-d487-4763-a29e-c24f39dbeb32/cluster-monitoring-operator/0.log"
Apr 20 14:01:57.062197 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:57.062169 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gx9q4_0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5/node-exporter/0.log"
Apr 20 14:01:57.081615 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:57.081593 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gx9q4_0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5/kube-rbac-proxy/0.log"
Apr 20 14:01:57.101791 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:57.101766 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gx9q4_0dc9fc48-225a-4d0f-80cf-e171c1dfe6d5/init-textfile/0.log"
Apr 20 14:01:58.795728 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.795699 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"]
Apr 20 14:01:58.799573 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.799550 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:58.801859 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.801827 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7jq8s\"/\"openshift-service-ca.crt\""
Apr 20 14:01:58.801980 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.801905 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7jq8s\"/\"kube-root-ca.crt\""
Apr 20 14:01:58.801980 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.801950 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7jq8s\"/\"default-dockercfg-w22fr\""
Apr 20 14:01:58.805706 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.805687 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"]
Apr 20 14:01:58.809896 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.809860 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-b5jqm_65ad83cd-9d60-4721-b86e-87436c3f0696/networking-console-plugin/0.log"
Apr 20 14:01:58.934503 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.934467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-sys\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:58.934503 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.934509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8npb6\" (UniqueName: \"kubernetes.io/projected/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-kube-api-access-8npb6\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:58.934727 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.934539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-podres\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:58.934727 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.934581 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-lib-modules\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:58.934727 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:58.934606 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-proc\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036070 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036037 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-proc\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036268 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-sys\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036268 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-proc\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036268 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8npb6\" (UniqueName: \"kubernetes.io/projected/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-kube-api-access-8npb6\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036268 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036249 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-sys\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036439 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-podres\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036439 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-lib-modules\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036439 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-podres\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.036439 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.036405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-lib-modules\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.045399 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.045364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8npb6\" (UniqueName: \"kubernetes.io/projected/e5c2274b-cae1-4f3c-b24d-c25ce34f1c76-kube-api-access-8npb6\") pod \"perf-node-gather-daemonset-lc69r\" (UID: \"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.111455 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.111374 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:01:59.236317 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.236291 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"]
Apr 20 14:01:59.238094 ip-10-0-142-144 kubenswrapper[2573]: W0420 14:01:59.238060 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode5c2274b_cae1_4f3c_b24d_c25ce34f1c76.slice/crio-7fd958954c31716f8935dc36b1e2efd1bceed30e6605a5cb00b8526ceafa5fd0 WatchSource:0}: Error finding container 7fd958954c31716f8935dc36b1e2efd1bceed30e6605a5cb00b8526ceafa5fd0: Status 404 returned error can't find the container with id 7fd958954c31716f8935dc36b1e2efd1bceed30e6605a5cb00b8526ceafa5fd0
Apr 20 14:01:59.372911 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.372825 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/2.log"
Apr 20 14:01:59.377102 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.377080 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rtbvg_409c02a3-0a51-4fe6-813b-cc03f7497104/console-operator/3.log"
Apr 20 14:01:59.852857 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.852828 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4d8b4658-mbq2h_f2c32770-7d05-4a4f-83e9-7137693639ad/console/0.log"
Apr 20 14:01:59.882614 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:01:59.882585 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-b7824_248d400c-8b1e-42d8-a2ec-ba381005c2c7/download-server/0.log"
Apr 20 14:02:00.127839 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:00.127760 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r" event={"ID":"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76","Type":"ContainerStarted","Data":"54a13ffcc259a5eca8bb9f04d50cc890375acbf6c708c9901acda55e9c47dfbd"}
Apr 20 14:02:00.127839 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:00.127794 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r" event={"ID":"e5c2274b-cae1-4f3c-b24d-c25ce34f1c76","Type":"ContainerStarted","Data":"7fd958954c31716f8935dc36b1e2efd1bceed30e6605a5cb00b8526ceafa5fd0"}
Apr 20 14:02:00.128020 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:00.127873 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:02:00.147196 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:00.147131 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r" podStartSLOduration=2.147117015 podStartE2EDuration="2.147117015s" podCreationTimestamp="2026-04-20 14:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:02:00.144928239 +0000 UTC m=+1876.740619316" watchObservedRunningTime="2026-04-20 14:02:00.147117015 +0000 UTC m=+1876.742808090"
Apr 20 14:02:01.244940 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:01.244913 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p7fbq_8d3811b3-7e75-4345-b591-277c5aecb5fd/dns/0.log"
Apr 20 14:02:01.264374 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:01.264347 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p7fbq_8d3811b3-7e75-4345-b591-277c5aecb5fd/kube-rbac-proxy/0.log"
Apr 20 14:02:01.335672 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:01.335637 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4qdsh_41710337-4f82-4bb8-abe7-f7a5cc3d9802/dns-node-resolver/0.log"
Apr 20 14:02:01.867584 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:01.867555 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kmm7p_5e473fb6-5d6c-47e5-9f17-d87b134e316e/node-ca/0.log"
Apr 20 14:02:02.709672 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:02.709592 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf5p9vn_f6293098-abf3-4a59-b6d9-be0f73a7ef51/istio-proxy/0.log"
Apr 20 14:02:02.912303 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:02.912276 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-67666b9c78-x7c4n_a70ca6b1-f55d-4081-b09f-dd5454b489d3/router/0.log"
Apr 20 14:02:03.423872 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:03.423844 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hdgh8_8e1662ff-63f6-4f08-9e96-75f038878584/serve-healthcheck-canary/0.log"
Apr 20 14:02:03.890463 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:03.890430 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h754j_824d27e0-a488-4bf8-badb-5a72756a911c/kube-rbac-proxy/0.log"
Apr 20 14:02:03.910227 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:03.910199 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h754j_824d27e0-a488-4bf8-badb-5a72756a911c/exporter/0.log"
Apr 20 14:02:03.930714 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:03.930685 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h754j_824d27e0-a488-4bf8-badb-5a72756a911c/extractor/0.log"
Apr 20 14:02:05.944254 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:05.944227 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-74df864bdc-7jpsf_82fe8d2e-0333-4462-aed7-6444cc682cff/maas-api/0.log"
Apr 20 14:02:05.979805 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:05.978126 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-67b98965d5-msrm8_26e2183b-e192-4c7c-829d-503b4c066515/manager/0.log"
Apr 20 14:02:06.111223 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:06.111195 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8cd4c57cb-zqkmf_10cde495-20e7-4979-9d71-c3cf9f4b00f3/manager/0.log"
Apr 20 14:02:06.140700 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:06.140675 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"
Apr 20 14:02:07.301393 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:07.301360 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-59c6b8cc85-9dkks_94bbdb99-f7ef-4514-82dd-4850b5c9ec5f/manager/0.log"
Apr 20 14:02:07.353528 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:07.353490 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-bnwlr_50e2cde6-7239-4b66-b403-a8ba79225068/openshift-lws-operator/0.log"
Apr 20 14:02:11.734658 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:11.734630 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kl8nw_c643cd33-a7a0-4649-8ea1-1c6cc7ad1130/migrator/0.log"
Apr 20 14:02:11.755583 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:11.755555 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kl8nw_c643cd33-a7a0-4649-8ea1-1c6cc7ad1130/graceful-termination/0.log"
Apr 20 14:02:12.112453 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:12.112423 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-j5mhj_689d9ed8-d3dd-4b84-a93f-cc84672538b6/kube-storage-version-migrator-operator/1.log"
Apr 20 14:02:12.113385 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:12.113368 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-j5mhj_689d9ed8-d3dd-4b84-a93f-cc84672538b6/kube-storage-version-migrator-operator/0.log"
Apr 20 14:02:13.071155 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.071108 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2s69b_6f769d40-1c0a-4957-8061-892b0f5e5266/kube-multus/0.log"
Apr 20 14:02:13.264035 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.264005 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g88jt_29c60f5b-f12d-43ec-a794-f2abbe748308/kube-multus-additional-cni-plugins/0.log"
Apr 20 14:02:13.294311 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.294284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g88jt_29c60f5b-f12d-43ec-a794-f2abbe748308/egress-router-binary-copy/0.log"
Apr 20 14:02:13.315529 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.315503 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g88jt_29c60f5b-f12d-43ec-a794-f2abbe748308/cni-plugins/0.log"
Apr 20 14:02:13.338098 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.338037 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g88jt_29c60f5b-f12d-43ec-a794-f2abbe748308/bond-cni-plugin/0.log"
Apr 20 14:02:13.359590 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.359572 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g88jt_29c60f5b-f12d-43ec-a794-f2abbe748308/routeoverride-cni/0.log"
Apr 20 14:02:13.382228 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.382207 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g88jt_29c60f5b-f12d-43ec-a794-f2abbe748308/whereabouts-cni-bincopy/0.log"
Apr 20 14:02:13.404216 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.404192 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g88jt_29c60f5b-f12d-43ec-a794-f2abbe748308/whereabouts-cni/0.log"
Apr 20 14:02:13.712901 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.712822 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5w9cl_0de99a89-e8e5-491a-90c3-5c371ed6705f/network-metrics-daemon/0.log"
Apr 20 14:02:13.730786 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:13.730749 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5w9cl_0de99a89-e8e5-491a-90c3-5c371ed6705f/kube-rbac-proxy/0.log"
Apr 20 14:02:14.521294 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.521251 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-controller/0.log"
Apr 20 14:02:14.536862 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.536834 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/0.log"
Apr 20 14:02:14.544615 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.544596 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovn-acl-logging/1.log"
Apr 20 14:02:14.561457 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.561439 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/kube-rbac-proxy-node/0.log"
Apr 20 14:02:14.583769 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.583749 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 14:02:14.609881 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.609814 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/northd/0.log"
Apr 20 14:02:14.629426 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.629405 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/nbdb/0.log"
Apr 20 14:02:14.651661 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.651643 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/sbdb/0.log"
Apr 20 14:02:14.761909 ip-10-0-142-144 kubenswrapper[2573]: I0420 14:02:14.761881 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-drksq_932d1d43-95d3-476c-b3d2-da80b4fcf711/ovnkube-controller/0.log"
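Each entry above consists of a journald prefix (syslog-style timestamp, hostname, unit name, unit PID) followed by a klog header: a severity letter fused with the month and day (e.g. `I0420`), a time with microseconds, the emitting PID, and `file:line]` before the structured message. A minimal sketch for splitting one such line into fields — the regex, field names, and `parse_journal_line` helper are my own illustration, not anything the kubelet or journald ships:

```python
import re

# Layout observed in this capture (an assumption; journald output varies by
# configuration, and klog messages may span multiple journal entries):
#   <syslog timestamp> <host> <unit>[<pid>]: <level><MMDD> <time> <pid> <file:line>] <message>
LINE_RE = re.compile(
    r'^(?P<stamp>\w{3} \d{1,2} \d{2}:\d{2}:\d{2}\.\d+) '
    r'(?P<host>\S+) '
    r'(?P<unit>[\w-]+)\[(?P<unit_pid>\d+)\]: '
    r'(?P<level>[IWEF])(?P<klog_date>\d{4}) (?P<klog_time>\d{2}:\d{2}:\d{2}\.\d+)\s+'
    r'(?P<pid>\d+) (?P<source>[\w.]+:\d+)\] (?P<msg>.*)$'
)

def parse_journal_line(line: str):
    """Return a dict of named fields for one kubelet journal entry, or None."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

# One of the entries from the log above, used as a smoke test:
sample = ('Apr 20 14:01:59.111455 ip-10-0-142-144 kubenswrapper[2573]: '
          'I0420 14:01:59.111374 2573 util.go:30] '
          '"No sandbox for pod can be found. Need to start a new one" '
          'pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-lc69r"')
entry = parse_journal_line(sample)
```

Filtering on `entry["level"]` or `entry["source"]` then makes it easy to pull, say, only the `W` lines or only the `log.go:25` "Finished parsing log file" entries out of a capture like this one.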