Apr 21 14:53:20.184240 ip-10-0-130-121 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 14:53:20.184255 ip-10-0-130-121 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 14:53:20.184265 ip-10-0-130-121 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 14:53:20.184611 ip-10-0-130-121 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 14:53:30.288253 ip-10-0-130-121 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 14:53:30.288276 ip-10-0-130-121 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f7a30b4f7c0841f295ceb384103a8355 --
Apr 21 14:55:53.750523 ip-10-0-130-121 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 14:55:54.192289 ip-10-0-130-121 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:55:54.192289 ip-10-0-130-121 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 14:55:54.192289 ip-10-0-130-121 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:55:54.192289 ip-10-0-130-121 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 14:55:54.192289 ip-10-0-130-121 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 14:55:54.194525 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.194429 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 21 14:55:54.196802 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196787 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196803 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196807 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196811 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196814 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196817 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196820 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196823 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196826 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196828 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196831 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196834 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:54.196837 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196842 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196845 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196848 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196851 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196854 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196856 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196859 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196861 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196864 2576 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196867 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196869 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196872 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196874 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196877 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196880 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196883 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196888 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196891 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196893 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:54.197119 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196896 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196898 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196901 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196904 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196906 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196909 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196911 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196914 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196916 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196927 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196930 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196933 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk 
Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196936 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196940 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196945 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196950 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196953 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196956 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196959 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:54.197582 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196961 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196965 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196967 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196970 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196973 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196976 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196978 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196981 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196983 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196986 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196989 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196992 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196995 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.196997 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197000 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:54.198147 
ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197002 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197005 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197008 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197011 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197013 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:54.198147 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197016 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197018 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197021 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197024 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197026 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197030 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197033 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197036 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197039 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197041 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197044 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197047 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197049 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197052 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197055 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197057 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197470 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197477 2576 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197480 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:54.198658 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197483 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197486 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197491 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197494 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197497 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197499 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197502 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197522 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197526 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197529 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197532 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197537 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197540 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197542 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197545 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197548 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197551 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197553 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197556 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197560 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:54.199174 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197563 2576 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197565 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197568 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197571 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197573 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197576 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197579 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197582 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197584 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197587 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197590 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197592 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197595 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197598 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197601 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197604 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197606 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197609 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197611 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197614 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:54.199665 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197616 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197619 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197621 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:54.200159 
ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197624 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197627 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197629 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197632 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197635 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197637 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197640 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197642 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197646 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197648 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197651 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197654 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197656 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197659 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197661 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197666 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 14:55:54.200159 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197669 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197672 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197675 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197685 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197689 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197692 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197695 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197698 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197700 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197703 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197706 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197708 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197711 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197713 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197716 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197719 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197722 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197725 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197727 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197731 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:54.200638 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197734 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197736 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:54.201117 ip-10-0-130-121 
kubenswrapper[2576]: W0421 14:55:54.197739 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.197741 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198513 2576 flags.go:64] FLAG: --address="0.0.0.0" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198523 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198529 2576 flags.go:64] FLAG: --anonymous-auth="true" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198544 2576 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198549 2576 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198553 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198558 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198562 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198565 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198568 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198572 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198575 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198578 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198581 2576 flags.go:64] FLAG: --cgroup-root="" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198584 2576 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198587 2576 flags.go:64] FLAG: --client-ca-file="" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198590 2576 flags.go:64] FLAG: --cloud-config="" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198593 2576 flags.go:64] FLAG: --cloud-provider="external" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198597 2576 flags.go:64] FLAG: --cluster-dns="[]" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198602 2576 flags.go:64] FLAG: --cluster-domain="" Apr 21 14:55:54.201117 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198605 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198609 2576 flags.go:64] FLAG: --config-dir="" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198612 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 
14:55:54.198615 2576 flags.go:64] FLAG: --container-log-max-files="5" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198619 2576 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198622 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198626 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198629 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198632 2576 flags.go:64] FLAG: --contention-profiling="false" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198635 2576 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198638 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198642 2576 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198645 2576 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198650 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198653 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198656 2576 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198659 2576 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198662 2576 flags.go:64] FLAG: --enable-server="true" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198665 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198670 2576 flags.go:64] FLAG: --event-burst="100" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198674 2576 flags.go:64] FLAG: --event-qps="50" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198677 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198680 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198683 2576 flags.go:64] FLAG: --eviction-hard="" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198688 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 14:55:54.201710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198691 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198694 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198697 2576 flags.go:64] FLAG: --eviction-soft="" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198700 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 14:55:54.202327 
ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198703 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198706 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198709 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198712 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198720 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198723 2576 flags.go:64] FLAG: --feature-gates="" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198727 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198730 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198733 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198737 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198740 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198743 2576 flags.go:64] FLAG: --help="false" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198746 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198749 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198752 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198755 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198759 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198762 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198765 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 14:55:54.202327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198768 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198771 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198774 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198777 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198780 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198783 2576 flags.go:64] FLAG: --kube-reserved="" Apr 21 
14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198786 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198789 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198791 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198794 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198797 2576 flags.go:64] FLAG: --lock-file="" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198800 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198803 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198806 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198811 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198814 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198818 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198823 2576 flags.go:64] FLAG: --logging-format="text" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198826 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198830 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198833 2576 flags.go:64] FLAG: --manifest-url="" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198837 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198841 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198845 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198854 2576 flags.go:64] FLAG: --max-pods="110" Apr 21 14:55:54.202905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198857 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198860 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198863 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198866 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198869 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198872 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198875 2576 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198883 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198886 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198889 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198892 2576 flags.go:64] FLAG: --pod-cidr="" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198895 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198901 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198904 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198907 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198910 2576 flags.go:64] FLAG: --port="10250" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198913 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198916 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00e90ebd1ccf809e9" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198919 2576 flags.go:64] FLAG: --qos-reserved="" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198922 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198925 2576 flags.go:64] FLAG: --register-node="true" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198929 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198932 2576 flags.go:64] FLAG: --register-with-taints="" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198935 2576 flags.go:64] FLAG: --registry-burst="10" Apr 21 14:55:54.203526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198939 2576 flags.go:64] FLAG: --registry-qps="5" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198943 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198946 2576 flags.go:64] FLAG: --reserved-memory="" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198950 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198954 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198957 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198960 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198963 2576 flags.go:64] FLAG: --runonce="false" Apr 21 14:55:54.204137 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:55:54.198966 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198969 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198972 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198975 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198978 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198981 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198984 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198987 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198990 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198993 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198996 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.198999 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199002 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199005 2576 flags.go:64] FLAG: --system-cgroups="" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199008 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199013 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199016 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 21 14:55:54.204137 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199018 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199023 2576 flags.go:64] FLAG: --tls-min-version="" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199026 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199028 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199031 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199034 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199037 2576 flags.go:64] FLAG: --v="2" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199041 2576 flags.go:64] FLAG: --version="false" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199047 2576 flags.go:64] FLAG: --vmodule="" Apr 21 14:55:54.204786 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:55:54.199052 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199055 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199198 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199203 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199208 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199211 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199214 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199217 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199220 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199223 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199226 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199228 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199231 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:54.204786 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199234 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199253 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199257 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199261 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199264 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199267 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199270 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199273 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199276 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: 
W0421 14:55:54.199278 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199281 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199284 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199286 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199289 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199291 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199294 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199297 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199300 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199303 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:54.205347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199306 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199309 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199314 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199316 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199319 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199322 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199324 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199327 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199329 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199332 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199334 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199337 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199339 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 
21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199342 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199344 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199347 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199349 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199352 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199354 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199357 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:54.205827 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199359 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199362 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199365 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199367 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199370 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199373 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199375 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199377 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199380 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199382 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199385 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199388 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199391 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199394 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199398 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: 
W0421 14:55:54.199401 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199403 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199406 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199408 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199411 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:54.206607 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199414 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199416 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199419 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199421 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199424 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199428 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199431 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199434 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199436 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199439 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199441 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199444 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199447 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199449 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199452 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:54.207204 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.199454 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.199459 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.207394 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.207420 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207711 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207727 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207731 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207734 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207737 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207740 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207744 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207750 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207755 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207759 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:54.207753 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207762 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207765 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207769 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207772 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207774 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207777 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207782 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207786 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207789 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207791 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207795 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207797 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207800 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207803 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207805 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207808 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207811 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207814 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207817 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207819 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:54.208164 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207822 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207829 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207832 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207834 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207837 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207839 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207842 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207845 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207847 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207850 2576 feature_gate.go:328] unrecognized feature 
gate: PinnedImages Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207852 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207855 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207858 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207861 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207864 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207867 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207870 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207872 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207875 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207878 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:54.208678 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207880 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207883 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207885 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207888 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207890 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207893 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207896 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207899 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207902 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207904 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207907 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207910 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 
14:55:54.207912 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207915 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207917 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207920 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207922 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207926 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207928 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207931 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:54.209166 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207933 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207936 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207939 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207941 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207944 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207947 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207950 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207952 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207956 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207959 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207961 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207964 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207967 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207969 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207972 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 
14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.207975 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:54.209676 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.207981 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208110 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208116 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208119 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208122 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208125 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208128 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208131 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208133 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208136 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208138 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208141 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208144 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208147 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208150 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208152 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208155 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208158 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:54.210063 ip-10-0-130-121 
kubenswrapper[2576]: W0421 14:55:54.208160 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:54.210063 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208163 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208165 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208168 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208171 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208176 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208179 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208182 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208185 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208187 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208190 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208192 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208195 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208197 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208200 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208202 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208205 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208207 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208210 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208213 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:54.210551 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208215 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208218 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 
14:55:54.208220 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208223 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208225 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208228 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208230 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208233 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208254 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208257 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208260 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208263 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208266 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208268 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208271 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208274 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208277 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208279 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208283 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208285 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:54.211005 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208288 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208291 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208293 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208296 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208299 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 
14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208301 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208304 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208306 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208309 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208313 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208316 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208319 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208322 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208325 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208327 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208330 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208332 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208334 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208337 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208340 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:54.211595 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208343 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208346 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208348 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208351 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208354 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208356 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208359 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208362 
2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:54.208364 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.208369 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:55:54.212068 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.209124 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 14:55:54.213047 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.213033 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 14:55:54.214022 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.214010 2576 server.go:1019] "Starting client certificate rotation" Apr 21 14:55:54.214123 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.214106 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 14:55:54.214159 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.214143 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 14:55:54.238788 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.238767 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 14:55:54.243447 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.243425 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 14:55:54.255820 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.255798 2576 log.go:25] "Validated CRI v1 runtime API" Apr 21 14:55:54.263055 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.263039 2576 log.go:25] "Validated CRI v1 image API" Apr 21 14:55:54.264580 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.264564 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 14:55:54.266468 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.266447 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 14:55:54.267207 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.267185 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b41c2b50-524d-40f1-b0e7-ea7860d46263:/dev/nvme0n1p4 b61d9500-0ecd-4b6c-9f73-67955c7b0053:/dev/nvme0n1p3] Apr 21 14:55:54.267297 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.267206 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 
fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 14:55:54.274989 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.274869 2576 manager.go:217] Machine: {Timestamp:2026-04-21 14:55:54.272953882 +0000 UTC m=+0.407931486 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101269 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2cc93fe7948a5784b06daba4edf467 SystemUUID:ec2cc93f-e794-8a57-84b0-6daba4edf467 BootID:f7a30b4f-7c08-41f2-95ce-b384103a8355 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2a:a0:67:17:97 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2a:a0:67:17:97 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:68:ad:fe:55:ab Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 14:55:54.274989 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.274977 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 14:55:54.275098 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.275063 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 14:55:54.275474 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.275453 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 14:55:54.275610 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.275477 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-121.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 14:55:54.275658 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.275620 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 14:55:54.275658 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.275629 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 14:55:54.275658 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.275644 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 14:55:54.276405 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.276394 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 14:55:54.277140 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.277131 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 21 14:55:54.277264 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.277255 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 14:55:54.279709 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.279700 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 21 14:55:54.279757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.279713 2576 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 21 14:55:54.279757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.279724 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 14:55:54.279757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.279733 2576 kubelet.go:397] "Adding apiserver pod source" Apr 21 14:55:54.279757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.279742 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 14:55:54.280837 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.280825 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 14:55:54.280882 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.280843 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 14:55:54.284039 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.284022 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 14:55:54.285205 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.285193 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 14:55:54.286746 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286732 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 14:55:54.286788 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286756 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 14:55:54.286788 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286766 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 14:55:54.286788 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286774 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 14:55:54.286788 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286784 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 14:55:54.286927 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286792 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 14:55:54.286927 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286801 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 14:55:54.286927 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286809 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 14:55:54.286927 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286820 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 14:55:54.286927 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286830 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 14:55:54.286927 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286853 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 14:55:54.286927 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.286865 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 14:55:54.288172 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.288148 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lcrvw" Apr 21 14:55:54.288312 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:55:54.288301 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 14:55:54.288356 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.288314 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 14:55:54.292049 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.292036 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 14:55:54.292104 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.292074 2576 server.go:1295] "Started kubelet" Apr 21 14:55:54.292155 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.292129 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 14:55:54.292281 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.292223 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 14:55:54.292342 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.292311 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 14:55:54.292942 ip-10-0-130-121 systemd[1]: Started Kubernetes Kubelet. Apr 21 14:55:54.293362 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.293345 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 14:55:54.294301 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.294284 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 21 14:55:54.295091 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.295071 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lcrvw" Apr 21 14:55:54.295750 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.295732 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-121.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 14:55:54.296467 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.296438 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-121.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 14:55:54.296552 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.296438 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 14:55:54.300275 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300257 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 14:55:54.300275 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300271 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 14:55:54.300826 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300802 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 14:55:54.300826 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300828 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 14:55:54.300949 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300873 2576 factory.go:55] Registering systemd factory Apr 
21 14:55:54.300949 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300892 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 21 14:55:54.300949 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300923 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 14:55:54.301121 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300985 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 14:55:54.301121 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.300992 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 14:55:54.301121 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.301095 2576 factory.go:153] Registering CRI-O factory
Apr 21 14:55:54.301121 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.301101 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found"
Apr 21 14:55:54.301121 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.301108 2576 factory.go:223] Registration of the crio container factory successfully
Apr 21 14:55:54.301359 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.301183 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 14:55:54.301359 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.301209 2576 factory.go:103] Registering Raw factory
Apr 21 14:55:54.301359 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.301221 2576 manager.go:1196] Started watching for new ooms in manager
Apr 21 14:55:54.301701 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.301677 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 14:55:54.301795 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.301736 2576 manager.go:319] Starting recovery of all containers
Apr 21 14:55:54.308798 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.308626 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:55:54.313346 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.313278 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-121.ec2.internal\" not found" node="ip-10-0-130-121.ec2.internal"
Apr 21 14:55:54.314382 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.314366 2576 manager.go:324] Recovery completed
Apr 21 14:55:54.318489 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.318475 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:55:54.320910 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.320894 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:55:54.320981 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.320927 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:55:54.320981 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.320939 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:55:54.321350 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.321336 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 14:55:54.321419 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.321351 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 14:55:54.321419 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.321371 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 14:55:54.323495 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.323483 2576 policy_none.go:49] "None policy: Start"
Apr 21 14:55:54.323531 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.323502 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 14:55:54.323531 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.323512 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 14:55:54.358877 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.358864 2576 manager.go:341] "Starting Device Plugin manager"
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.358892 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.358903 2576 server.go:85] "Starting device plugin registration server"
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.359143 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.359155 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.359232 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.359353 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.359361 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.359834 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 14:55:54.375235 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.359865 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-121.ec2.internal\" not found"
Apr 21 14:55:54.444148 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.444061 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 14:55:54.445485 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.445469 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 14:55:54.445577 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.445504 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 14:55:54.445577 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.445540 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 14:55:54.445577 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.445552 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 14:55:54.445704 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.445596 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 14:55:54.448149 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.448124 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:55:54.460094 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.460076 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 14:55:54.461108 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.461094 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 14:55:54.461167 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.461125 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 14:55:54.461167 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.461142 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientPID"
Apr 21 14:55:54.461229 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.461168 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-121.ec2.internal"
Apr 21 14:55:54.467920 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.467906 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-121.ec2.internal"
Apr 21 14:55:54.467968 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.467929 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-121.ec2.internal\": node \"ip-10-0-130-121.ec2.internal\" not found"
Apr 21 14:55:54.489479 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.489454
2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found" Apr 21 14:55:54.545962 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.545912 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal"] Apr 21 14:55:54.546053 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.546009 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:54.546997 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.546982 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:54.547068 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.547011 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:54.547068 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.547025 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:54.548106 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548094 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:54.548262 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548234 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.548305 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548276 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:54.548855 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548838 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:54.548855 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548850 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:54.548949 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548863 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:54.548949 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548870 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:54.548949 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548873 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:54.548949 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.548879 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:54.549886 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.549872 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.549957 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.549897 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:54.550626 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.550610 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:54.550686 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.550634 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:54.550686 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.550648 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:54.583812 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.583787 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-121.ec2.internal\" not found" node="ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.588451 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.588435 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-121.ec2.internal\" not found" node="ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.590147 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.590136 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found" Apr 21 14:55:54.690911 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.690887 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found" Apr 21 14:55:54.702184 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.702133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e294e9cc44aca557b7c9191559850248-config\") pod \"kube-apiserver-proxy-ip-10-0-130-121.ec2.internal\" (UID: \"e294e9cc44aca557b7c9191559850248\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.702184 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.702169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/16ec0454ffec8b367ca76459ebc7c451-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal\" (UID: \"16ec0454ffec8b367ca76459ebc7c451\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.702348 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.702192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16ec0454ffec8b367ca76459ebc7c451-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal\" (UID: \"16ec0454ffec8b367ca76459ebc7c451\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.791533 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.791498 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found" Apr 21 14:55:54.802795 
ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.802760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/16ec0454ffec8b367ca76459ebc7c451-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal\" (UID: \"16ec0454ffec8b367ca76459ebc7c451\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.802877 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.802803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16ec0454ffec8b367ca76459ebc7c451-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal\" (UID: \"16ec0454ffec8b367ca76459ebc7c451\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.802877 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.802823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e294e9cc44aca557b7c9191559850248-config\") pod \"kube-apiserver-proxy-ip-10-0-130-121.ec2.internal\" (UID: \"e294e9cc44aca557b7c9191559850248\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.802877 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.802851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/16ec0454ffec8b367ca76459ebc7c451-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal\" (UID: \"16ec0454ffec8b367ca76459ebc7c451\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.802877 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.802868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e294e9cc44aca557b7c9191559850248-config\") pod \"kube-apiserver-proxy-ip-10-0-130-121.ec2.internal\" (UID: \"e294e9cc44aca557b7c9191559850248\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.802996 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.802880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16ec0454ffec8b367ca76459ebc7c451-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal\" (UID: \"16ec0454ffec8b367ca76459ebc7c451\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.885970 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.885937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" Apr 21 14:55:54.890488 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:54.890466 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal"
Apr 21 14:55:54.891772 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.891750 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found"
Apr 21 14:55:54.992395 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:54.992302 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found"
Apr 21 14:55:55.092819 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.092788 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found"
Apr 21 14:55:55.193356 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.193327 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-121.ec2.internal\" not found"
Apr 21 14:55:55.213797 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.213772 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 14:55:55.213929 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.213912 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 14:55:55.213967 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.213938 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 14:55:55.265219 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.265169 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 14:55:55.280728 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.280707 2576 apiserver.go:52] "Watching apiserver"
Apr 21 14:55:55.291703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.291677 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 14:55:55.293310 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.293286 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf","openshift-cluster-node-tuning-operator/tuned-6bjz7","openshift-dns/node-resolver-kpdvd","openshift-network-diagnostics/network-check-target-zppc2","openshift-ovn-kubernetes/ovnkube-node-th598","kube-system/konnectivity-agent-jpxzc","openshift-image-registry/node-ca-7h2f9","openshift-multus/multus-5q2lt","openshift-multus/multus-additional-cni-plugins-sldvt","openshift-multus/network-metrics-daemon-ktgkr","openshift-network-operator/iptables-alerter-rw6cl"]
Apr 21 14:55:55.295642 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.295626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf"
Apr 21 14:55:55.296587 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.296551 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.297593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.297574 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.297692 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.297656 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:55:55.297748 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.297725 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:55:55.297884 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.297866 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gw772\"" Apr 21 14:55:55.298037 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298022 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 14:55:55.298153 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298112 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 14:55:55.298153 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298136 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 14:50:54 +0000 UTC" deadline="2028-01-17 06:18:08.254725682 +0000 UTC" Apr 21 14:55:55.298276 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298154 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15255h22m12.956574112s" Apr 21 14:55:55.298524 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298508 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 14:55:55.298608 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-584db\"" Apr 21 14:55:55.298652 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298632 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 14:55:55.298880 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298862 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.298988 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.298970 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:55:55.299673 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.299651 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-shmt9\"" Apr 21 14:55:55.299673 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.299664 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 14:55:55.299897 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.299860 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:55:55.300140 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.300123 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 14:55:55.300441 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.300424 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 14:55:55.300841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.300826 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 14:55:55.300970 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.300950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.301090 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.301074 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal" Apr 21 14:55:55.301719 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.301676 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 14:55:55.301719 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.301712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n5rp8\"" Apr 21 14:55:55.301912 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.301857 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 14:55:55.301972 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.301942 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 14:55:55.302063 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.302040 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vqbxz\"" Apr 21 14:55:55.302285 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.302269 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.302418 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.302402 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 14:55:55.302538 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.302522 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 14:55:55.302881 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.302832 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 14:55:55.303144 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.303128 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 14:55:55.303467 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.303450 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.304559 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304541 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:55.304647 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.304598 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:55:55.304729 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dadbd785-1d07-45b6-868c-c95e20421c54-tmp-dir\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.304772 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-var-lib-openvswitch\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.304813 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovn-node-metrics-cert\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.304813 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h88b\" (UniqueName: \"kubernetes.io/projected/297ac21d-4aa7-488f-8f40-48d7b969036b-kube-api-access-2h88b\") pod \"node-ca-7h2f9\" (UID: \"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.304904 ip-10-0-130-121 kubenswrapper[2576]: 
I0421 14:55:55.304830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-device-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.304904 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-systemd-units\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.304990 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-run-netns\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.304990 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/297ac21d-4aa7-488f-8f40-48d7b969036b-host\") pod \"node-ca-7h2f9\" (UID: \"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.305087 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.304998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-netns\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.305087 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j59sl\" (UniqueName: \"kubernetes.io/projected/91d8fda2-6bab-475f-bb2c-09e739e26078-kube-api-access-j59sl\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.305087 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-registration-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.305261 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-etc-selinux\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.305261 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/0c7e2c16-e24b-4449-ae17-3e6e83f0e900-konnectivity-ca\") pod \"konnectivity-agent-jpxzc\" (UID: \"0c7e2c16-e24b-4449-ae17-3e6e83f0e900\") " pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:55:55.305261 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-systemd\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.305261 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-run\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.305261 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/297ac21d-4aa7-488f-8f40-48d7b969036b-serviceca\") pod \"node-ca-7h2f9\" (UID: \"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.305261 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 14:55:55.305497 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-conf-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.305497 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0c7e2c16-e24b-4449-ae17-3e6e83f0e900-agent-certs\") pod \"konnectivity-agent-jpxzc\" (UID: \"0c7e2c16-e24b-4449-ae17-3e6e83f0e900\") " pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:55:55.305497 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.305497 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-kubelet\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.305497 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-openvswitch\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.305672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-system-cni-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.305672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-k8s-cni-cncf-io\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.305672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysconfig\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.305672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysctl-d\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.305672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovnkube-config\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.305672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91d8fda2-6bab-475f-bb2c-09e739e26078-cni-binary-copy\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.305672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-daemon-config\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.305906 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-ovn\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.305906 
ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305733 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.305906 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-modprobe-d\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.305906 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-kubernetes\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.305906 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysctl-conf\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-env-overrides\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2d8c\" (UniqueName: \"kubernetes.io/projected/bd6e163c-77e0-40c8-8dc0-b521635bd265-kube-api-access-h2d8c\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.305978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dadbd785-1d07-45b6-868c-c95e20421c54-hosts-file\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-node-log\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-log-socket\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-run-ovn-kubernetes\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-os-release\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.306160 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-multus-certs\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-systemd\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-cni-netd\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-socket-dir-parent\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-sys-fs\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306262 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-slash\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-cni-bin\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306355 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-etc-kubernetes\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e80a0011-4086-4205-a1ef-92d35adf2717-tmp\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-cni-multus\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-socket-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-sys\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306476 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306489 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306506 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-kubelet\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-etc-openvswitch\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306524 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-cnibin\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.306644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-var-lib-kubelet\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-host\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftkf\" (UniqueName: \"kubernetes.io/projected/e0c80303-a3d0-4f47-8d58-c49be590b002-kube-api-access-gftkf\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e80a0011-4086-4205-a1ef-92d35adf2717-etc-tuned\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vt7\" (UniqueName: \"kubernetes.io/projected/e80a0011-4086-4205-a1ef-92d35adf2717-kube-api-access-v9vt7\") pod \"tuned-6bjz7\" (UID: 
\"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-lib-modules\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-cni-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-cni-bin\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f257\" (UniqueName: \"kubernetes.io/projected/dadbd785-1d07-45b6-868c-c95e20421c54-kube-api-access-8f257\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovnkube-script-lib\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.306891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-hostroot\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.307179 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-79rjb\"" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.307189 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zsff7\"" Apr 21 14:55:55.307343 ip-10-0-130-121 kubenswrapper[2576]: I0421 
14:55:55.307259 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 14:55:55.309341 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.309323 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rn6jn\""
Apr 21 14:55:55.309637 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.309534 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 14:55:55.309637 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.309559 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 14:55:55.309637 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.309565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2znd6\""
Apr 21 14:55:55.309994 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.309971 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 14:55:55.310326 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.309561 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 14:55:55.310837 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.310813 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 14:55:55.312477 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.312454 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal"]
Apr 21 14:55:55.313056 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.313040 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 14:55:55.313228 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.313216 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal"
Apr 21 14:55:55.317312 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.317295 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 14:55:55.332269 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.332226 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 14:55:55.332378 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.332316 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal"]
Apr 21 14:55:55.351259 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.351211 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4zb8g"
Apr 21 14:55:55.383697 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.383657 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4zb8g" Apr
21 14:55:55.399373 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.399348 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16ec0454ffec8b367ca76459ebc7c451.slice/crio-2d3adc413bf7117bf0c9bf6d502bb58043fdcbd48f2461cbed2376e73211ec81 WatchSource:0}: Error finding container 2d3adc413bf7117bf0c9bf6d502bb58043fdcbd48f2461cbed2376e73211ec81: Status 404 returned error can't find the container with id 2d3adc413bf7117bf0c9bf6d502bb58043fdcbd48f2461cbed2376e73211ec81 Apr 21 14:55:55.399577 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.399557 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode294e9cc44aca557b7c9191559850248.slice/crio-c681f5e6086bb193ce33b9189920a6729d3d61a52b23df0f18d29340d8a9c37a WatchSource:0}: Error finding container c681f5e6086bb193ce33b9189920a6729d3d61a52b23df0f18d29340d8a9c37a: Status 404 returned error can't find the container with id c681f5e6086bb193ce33b9189920a6729d3d61a52b23df0f18d29340d8a9c37a Apr 21 14:55:55.402295 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.402268 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 14:55:55.403780 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.403761 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 14:55:55.407519 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e80a0011-4086-4205-a1ef-92d35adf2717-etc-tuned\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.407588 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vt7\" (UniqueName: \"kubernetes.io/projected/e80a0011-4086-4205-a1ef-92d35adf2717-kube-api-access-v9vt7\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.407588 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-lib-modules\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.407666 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.407666 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-cni-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.407666 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:55:55.407639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-cni-bin\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.407806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f257\" (UniqueName: \"kubernetes.io/projected/dadbd785-1d07-45b6-868c-c95e20421c54-kube-api-access-8f257\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.407806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovnkube-script-lib\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.407806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.407806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-hostroot\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.407806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-cni-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.407806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-lib-modules\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.407806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dadbd785-1d07-45b6-868c-c95e20421c54-tmp-dir\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.407806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-cni-bin\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.408204 
ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407890 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-hostroot\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.407976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79gpq\" (UniqueName: \"kubernetes.io/projected/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-kube-api-access-79gpq\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z67wk\" (UniqueName: \"kubernetes.io/projected/f531f65c-d73f-48df-b4b9-fffda9589a9e-kube-api-access-z67wk\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-var-lib-openvswitch\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovn-node-metrics-cert\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h88b\" (UniqueName: \"kubernetes.io/projected/297ac21d-4aa7-488f-8f40-48d7b969036b-kube-api-access-2h88b\") pod \"node-ca-7h2f9\" (UID: \"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-var-lib-openvswitch\") pod \"ovnkube-node-th598\" (UID: 
\"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dadbd785-1d07-45b6-868c-c95e20421c54-tmp-dir\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.408204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-device-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-systemd-units\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-run-netns\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-device-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/297ac21d-4aa7-488f-8f40-48d7b969036b-host\") pod \"node-ca-7h2f9\" (UID: \"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-netns\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-systemd-units\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/297ac21d-4aa7-488f-8f40-48d7b969036b-host\") pod \"node-ca-7h2f9\" (UID: 
\"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-netns\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j59sl\" (UniqueName: \"kubernetes.io/projected/91d8fda2-6bab-475f-bb2c-09e739e26078-kube-api-access-j59sl\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-registration-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-run-netns\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-etc-selinux\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovnkube-script-lib\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0c7e2c16-e24b-4449-ae17-3e6e83f0e900-konnectivity-ca\") pod \"konnectivity-agent-jpxzc\" (UID: \"0c7e2c16-e24b-4449-ae17-3e6e83f0e900\") " pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-registration-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-systemd\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-run\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.408767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-etc-selinux\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-systemd\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/297ac21d-4aa7-488f-8f40-48d7b969036b-serviceca\") pod \"node-ca-7h2f9\" (UID: \"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-conf-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-run\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0c7e2c16-e24b-4449-ae17-3e6e83f0e900-agent-certs\") pod \"konnectivity-agent-jpxzc\" (UID: \"0c7e2c16-e24b-4449-ae17-3e6e83f0e900\") " pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-conf-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408699 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-kubelet\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-openvswitch\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-system-cni-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-k8s-cni-cncf-io\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysconfig\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysctl-d\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-kubelet\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408929 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovnkube-config\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0c7e2c16-e24b-4449-ae17-3e6e83f0e900-konnectivity-ca\") pod \"konnectivity-agent-jpxzc\" (UID: \"0c7e2c16-e24b-4449-ae17-3e6e83f0e900\") " pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:55:55.409593 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/297ac21d-4aa7-488f-8f40-48d7b969036b-serviceca\") pod \"node-ca-7h2f9\" (UID: \"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.408995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91d8fda2-6bab-475f-bb2c-09e739e26078-cni-binary-copy\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-openvswitch\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-system-cni-dir\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-iptables-alerter-script\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysconfig\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 
14:55:55.409169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysctl-d\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-k8s-cni-cncf-io\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-cni-binary-copy\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-daemon-config\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-ovn\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-modprobe-d\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-kubernetes\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysctl-conf\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-env-overrides\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2d8c\" (UniqueName: \"kubernetes.io/projected/bd6e163c-77e0-40c8-8dc0-b521635bd265-kube-api-access-h2d8c\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.410556 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409595 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dadbd785-1d07-45b6-868c-c95e20421c54-hosts-file\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovnkube-config\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-kubernetes\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-node-log\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-log-socket\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91d8fda2-6bab-475f-bb2c-09e739e26078-cni-binary-copy\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-log-socket\") pod \"ovnkube-node-th598\" (UID: 
\"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dadbd785-1d07-45b6-868c-c95e20421c54-hosts-file\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.409963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-node-log\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-ovn\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-sysctl-conf\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-run-ovn-kubernetes\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-daemon-config\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-os-release\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-os-release\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-host-slash\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " 
pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-etc-modprobe-d\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.411334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-run-ovn-kubernetes\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-system-cni-dir\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-multus-certs\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bd6e163c-77e0-40c8-8dc0-b521635bd265-env-overrides\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-systemd\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-cni-netd\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-run-multus-certs\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-socket-dir-parent\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-run-systemd\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-sys-fs\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-slash\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-sys-fs\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-cni-bin\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-etc-kubernetes\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e80a0011-4086-4205-a1ef-92d35adf2717-tmp\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-slash\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-cni-multus\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-cni-netd\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.412165 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-host-cni-bin\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-socket-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-etc-kubernetes\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-multus-socket-dir-parent\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-os-release\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-sys\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-cni-multus\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-kubelet\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-host-var-lib-kubelet\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-sys\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.410933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e0c80303-a3d0-4f47-8d58-c49be590b002-socket-dir\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-cnibin\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5twkx\" (UniqueName: \"kubernetes.io/projected/071f55a0-8b84-4aba-97bf-3b7856b4c800-kube-api-access-5twkx\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-etc-openvswitch\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-cnibin\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 
14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-var-lib-kubelet\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.413058 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bd6e163c-77e0-40c8-8dc0-b521635bd265-etc-openvswitch\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-host\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-var-lib-kubelet\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91d8fda2-6bab-475f-bb2c-09e739e26078-cnibin\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gftkf\" (UniqueName: \"kubernetes.io/projected/e0c80303-a3d0-4f47-8d58-c49be590b002-kube-api-access-gftkf\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.411327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e80a0011-4086-4205-a1ef-92d35adf2717-host\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.412136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bd6e163c-77e0-40c8-8dc0-b521635bd265-ovn-node-metrics-cert\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.412969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0c7e2c16-e24b-4449-ae17-3e6e83f0e900-agent-certs\") pod \"konnectivity-agent-jpxzc\" (UID: \"0c7e2c16-e24b-4449-ae17-3e6e83f0e900\") " pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.413047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e80a0011-4086-4205-a1ef-92d35adf2717-etc-tuned\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.413703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.413414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e80a0011-4086-4205-a1ef-92d35adf2717-tmp\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.430608 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.430585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f257\" (UniqueName: \"kubernetes.io/projected/dadbd785-1d07-45b6-868c-c95e20421c54-kube-api-access-8f257\") pod \"node-resolver-kpdvd\" (UID: \"dadbd785-1d07-45b6-868c-c95e20421c54\") " pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.438336 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.438316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vt7\" (UniqueName: \"kubernetes.io/projected/e80a0011-4086-4205-a1ef-92d35adf2717-kube-api-access-v9vt7\") pod \"tuned-6bjz7\" (UID: \"e80a0011-4086-4205-a1ef-92d35adf2717\") " pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.441765 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.441737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftkf\" (UniqueName: \"kubernetes.io/projected/e0c80303-a3d0-4f47-8d58-c49be590b002-kube-api-access-gftkf\") pod \"aws-ebs-csi-driver-node-8kdjf\" (UID: \"e0c80303-a3d0-4f47-8d58-c49be590b002\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.444953 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.444927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j59sl\" (UniqueName: \"kubernetes.io/projected/91d8fda2-6bab-475f-bb2c-09e739e26078-kube-api-access-j59sl\") pod \"multus-5q2lt\" (UID: \"91d8fda2-6bab-475f-bb2c-09e739e26078\") " pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.445167 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.445149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2d8c\" (UniqueName: \"kubernetes.io/projected/bd6e163c-77e0-40c8-8dc0-b521635bd265-kube-api-access-h2d8c\") pod \"ovnkube-node-th598\" (UID: \"bd6e163c-77e0-40c8-8dc0-b521635bd265\") " pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.445231 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.445185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h88b\" (UniqueName: \"kubernetes.io/projected/297ac21d-4aa7-488f-8f40-48d7b969036b-kube-api-access-2h88b\") pod \"node-ca-7h2f9\" (UID: 
\"297ac21d-4aa7-488f-8f40-48d7b969036b\") " pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.448864 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.448818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" event={"ID":"16ec0454ffec8b367ca76459ebc7c451","Type":"ContainerStarted","Data":"2d3adc413bf7117bf0c9bf6d502bb58043fdcbd48f2461cbed2376e73211ec81"} Apr 21 14:55:55.449736 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.449712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal" event={"ID":"e294e9cc44aca557b7c9191559850248","Type":"ContainerStarted","Data":"c681f5e6086bb193ce33b9189920a6729d3d61a52b23df0f18d29340d8a9c37a"} Apr 21 14:55:55.454357 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.454340 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:55.454422 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.454363 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:55.454422 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.454373 2576 projected.go:194] Error preparing data for projected volume kube-api-access-nprfg for pod openshift-network-diagnostics/network-check-target-zppc2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:55.454493 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.454446 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg podName:cab0d6b8-24b3-4ee8-bf59-c526df4af70b nodeName:}" failed. No retries permitted until 2026-04-21 14:55:55.954420565 +0000 UTC m=+2.089398152 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nprfg" (UniqueName: "kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg") pod "network-check-target-zppc2" (UID: "cab0d6b8-24b3-4ee8-bf59-c526df4af70b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:55.512693 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.512665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:55.512693 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.512700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79gpq\" (UniqueName: \"kubernetes.io/projected/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-kube-api-access-79gpq\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.512913 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.512717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.512913 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.512734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z67wk\" (UniqueName: \"kubernetes.io/projected/f531f65c-d73f-48df-b4b9-fffda9589a9e-kube-api-access-z67wk\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:55.512913 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.512801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.512913 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.512842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-iptables-alerter-script\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.512913 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.512852 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:55.512913 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.512865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-cni-binary-copy\") pod \"multus-additional-cni-plugins-sldvt\" (UID: 
\"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513228 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.512947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-host-slash\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.513228 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-host-slash\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.513228 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:55.513144 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 14:55:56.013121588 +0000 UTC m=+2.148099177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:55.513228 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-system-cni-dir\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-system-cni-dir\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-os-release\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-cnibin\") pod 
\"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5twkx\" (UniqueName: \"kubernetes.io/projected/071f55a0-8b84-4aba-97bf-3b7856b4c800-kube-api-access-5twkx\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-cnibin\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/071f55a0-8b84-4aba-97bf-3b7856b4c800-os-release\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513726 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513726 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-iptables-alerter-script\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.513726 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-cni-binary-copy\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.513818 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.513776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/071f55a0-8b84-4aba-97bf-3b7856b4c800-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " 
pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.528346 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.528285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79gpq\" (UniqueName: \"kubernetes.io/projected/ed61183c-f5d7-41e8-9947-fc9a9e12a3da-kube-api-access-79gpq\") pod \"iptables-alerter-rw6cl\" (UID: \"ed61183c-f5d7-41e8-9947-fc9a9e12a3da\") " pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.528747 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.528727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5twkx\" (UniqueName: \"kubernetes.io/projected/071f55a0-8b84-4aba-97bf-3b7856b4c800-kube-api-access-5twkx\") pod \"multus-additional-cni-plugins-sldvt\" (UID: \"071f55a0-8b84-4aba-97bf-3b7856b4c800\") " pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.528833 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.528813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z67wk\" (UniqueName: \"kubernetes.io/projected/f531f65c-d73f-48df-b4b9-fffda9589a9e-kube-api-access-z67wk\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:55.554226 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.554204 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:55.615992 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.615961 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" Apr 21 14:55:55.622687 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.622655 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0c80303_a3d0_4f47_8d58_c49be590b002.slice/crio-84462aca88250036d410d1358d91d012e061d1cc4de61a86c842e96a40cac0b6 WatchSource:0}: Error finding container 84462aca88250036d410d1358d91d012e061d1cc4de61a86c842e96a40cac0b6: Status 404 returned error can't find the container with id 84462aca88250036d410d1358d91d012e061d1cc4de61a86c842e96a40cac0b6 Apr 21 14:55:55.640655 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.640631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" Apr 21 14:55:55.647318 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.647289 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode80a0011_4086_4205_a1ef_92d35adf2717.slice/crio-b0379d33ef9eb2fe1ce043959a53d5c72f03392bbe4ef9e618a08b32a02009b6 WatchSource:0}: Error finding container b0379d33ef9eb2fe1ce043959a53d5c72f03392bbe4ef9e618a08b32a02009b6: Status 404 returned error can't find the container with id b0379d33ef9eb2fe1ce043959a53d5c72f03392bbe4ef9e618a08b32a02009b6 Apr 21 14:55:55.654066 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.654047 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kpdvd" Apr 21 14:55:55.661023 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.661000 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddadbd785_1d07_45b6_868c_c95e20421c54.slice/crio-4ef8f2dedf51cd2b079c550fb7e7a63fa3b3368ab20c0fd77463de3a7e0a3923 WatchSource:0}: Error finding container 4ef8f2dedf51cd2b079c550fb7e7a63fa3b3368ab20c0fd77463de3a7e0a3923: Status 404 returned error can't find the container with id 4ef8f2dedf51cd2b079c550fb7e7a63fa3b3368ab20c0fd77463de3a7e0a3923 Apr 21 14:55:55.669471 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.669455 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:55:55.675547 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.675524 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6e163c_77e0_40c8_8dc0_b521635bd265.slice/crio-47c348df7b20e308f8b106a044298c17ef4aaaabcf689e4f2f37c8e88a52d0fc WatchSource:0}: Error finding container 47c348df7b20e308f8b106a044298c17ef4aaaabcf689e4f2f37c8e88a52d0fc: Status 404 returned error can't find the container with id 47c348df7b20e308f8b106a044298c17ef4aaaabcf689e4f2f37c8e88a52d0fc Apr 21 14:55:55.686146 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.686125 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:55:55.692232 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.692206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c7e2c16_e24b_4449_ae17_3e6e83f0e900.slice/crio-a68f463f8164855ade7e50c3d78818c7ea082957a3a7f9bd86f0fb98af0e3549 WatchSource:0}: Error finding container a68f463f8164855ade7e50c3d78818c7ea082957a3a7f9bd86f0fb98af0e3549: Status 404 returned error can't find the container with id a68f463f8164855ade7e50c3d78818c7ea082957a3a7f9bd86f0fb98af0e3549 Apr 21 14:55:55.718940 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.718911 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sldvt" Apr 21 14:55:55.719061 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.718963 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5q2lt" Apr 21 14:55:55.719266 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.718919 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7h2f9" Apr 21 14:55:55.735122 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:55.734937 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rw6cl" Apr 21 14:55:55.736347 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.736317 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod071f55a0_8b84_4aba_97bf_3b7856b4c800.slice/crio-c8fca705fdbad8fdbd3fdb2f437621cd27ad5e1cf701036ee3ffe5a60aed4a55 WatchSource:0}: Error finding container c8fca705fdbad8fdbd3fdb2f437621cd27ad5e1cf701036ee3ffe5a60aed4a55: Status 404 returned error can't find the container with id c8fca705fdbad8fdbd3fdb2f437621cd27ad5e1cf701036ee3ffe5a60aed4a55 Apr 21 14:55:55.736898 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.736816 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod297ac21d_4aa7_488f_8f40_48d7b969036b.slice/crio-3f51392c56bb8c2097ae96f254cda7be1a14bbce04fc627aef56828574f87120 WatchSource:0}: Error finding container 3f51392c56bb8c2097ae96f254cda7be1a14bbce04fc627aef56828574f87120: Status 404 returned error can't find the container with id 3f51392c56bb8c2097ae96f254cda7be1a14bbce04fc627aef56828574f87120 Apr 21 14:55:55.738213 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:55:55.738185 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91d8fda2_6bab_475f_bb2c_09e739e26078.slice/crio-6114fad325f103624e7532a7eaaf618c17a30d70c44cd43bd033e1e49d7ba31b WatchSource:0}: Error finding container 6114fad325f103624e7532a7eaaf618c17a30d70c44cd43bd033e1e49d7ba31b: Status 404 returned error can't find the container with id 6114fad325f103624e7532a7eaaf618c17a30d70c44cd43bd033e1e49d7ba31b Apr 21 14:55:56.021422 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.021338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:55:56.021422 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.021407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:56.021627 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:56.021536 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:56.021627 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:56.021594 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 14:55:57.021575983 +0000 UTC m=+3.156553573 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:56.022051 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:56.022029 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:56.022131 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:56.022060 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:56.022131 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:56.022072 2576 projected.go:194] Error preparing data for projected volume kube-api-access-nprfg for pod openshift-network-diagnostics/network-check-target-zppc2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:56.022131 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:56.022118 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg podName:cab0d6b8-24b3-4ee8-bf59-c526df4af70b nodeName:}" failed. No retries permitted until 2026-04-21 14:55:57.022100978 +0000 UTC m=+3.157078574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nprfg" (UniqueName: "kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg") pod "network-check-target-zppc2" (UID: "cab0d6b8-24b3-4ee8-bf59-c526df4af70b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:56.385400 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.385010 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:50:55 +0000 UTC" deadline="2027-09-30 01:59:49.656151512 +0000 UTC" Apr 21 14:55:56.385400 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.385288 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12635h3m53.270870421s" Apr 21 14:55:56.448053 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.448015 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:55:56.454908 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:56.452583 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:55:56.467401 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.467368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" event={"ID":"e0c80303-a3d0-4f47-8d58-c49be590b002","Type":"ContainerStarted","Data":"84462aca88250036d410d1358d91d012e061d1cc4de61a86c842e96a40cac0b6"} Apr 21 14:55:56.483291 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.483048 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:56.483291 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.483209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7h2f9" event={"ID":"297ac21d-4aa7-488f-8f40-48d7b969036b","Type":"ContainerStarted","Data":"3f51392c56bb8c2097ae96f254cda7be1a14bbce04fc627aef56828574f87120"} Apr 21 14:55:56.485492 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.485460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" event={"ID":"e80a0011-4086-4205-a1ef-92d35adf2717","Type":"ContainerStarted","Data":"b0379d33ef9eb2fe1ce043959a53d5c72f03392bbe4ef9e618a08b32a02009b6"} Apr 21 14:55:56.488435 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.488409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rw6cl" event={"ID":"ed61183c-f5d7-41e8-9947-fc9a9e12a3da","Type":"ContainerStarted","Data":"7aa745dac39a0aeb35aa556e67c7189ebff504806911bc0c1e009e4109fc2bfc"} Apr 21 14:55:56.497631 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.497600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q2lt" event={"ID":"91d8fda2-6bab-475f-bb2c-09e739e26078","Type":"ContainerStarted","Data":"6114fad325f103624e7532a7eaaf618c17a30d70c44cd43bd033e1e49d7ba31b"} Apr 21 14:55:56.502614 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.502583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sldvt" event={"ID":"071f55a0-8b84-4aba-97bf-3b7856b4c800","Type":"ContainerStarted","Data":"c8fca705fdbad8fdbd3fdb2f437621cd27ad5e1cf701036ee3ffe5a60aed4a55"} Apr 21 14:55:56.515505 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.515471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jpxzc" event={"ID":"0c7e2c16-e24b-4449-ae17-3e6e83f0e900","Type":"ContainerStarted","Data":"a68f463f8164855ade7e50c3d78818c7ea082957a3a7f9bd86f0fb98af0e3549"} Apr 21 14:55:56.527961 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.527932 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:56.537118 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.537086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"47c348df7b20e308f8b106a044298c17ef4aaaabcf689e4f2f37c8e88a52d0fc"} Apr 21 14:55:56.549609 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:56.549572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kpdvd" 
event={"ID":"dadbd785-1d07-45b6-868c-c95e20421c54","Type":"ContainerStarted","Data":"4ef8f2dedf51cd2b079c550fb7e7a63fa3b3368ab20c0fd77463de3a7e0a3923"} Apr 21 14:55:57.030019 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:57.029163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:55:57.030019 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:57.029229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:57.030019 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:57.029367 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:57.030019 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:57.029433 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 14:55:59.029412042 +0000 UTC m=+5.164389634 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:57.030019 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:57.029867 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:57.030019 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:57.029888 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:57.030019 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:57.029901 2576 projected.go:194] Error preparing data for projected volume kube-api-access-nprfg for pod openshift-network-diagnostics/network-check-target-zppc2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:57.030019 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:57.029944 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg podName:cab0d6b8-24b3-4ee8-bf59-c526df4af70b nodeName:}" failed. No retries permitted until 2026-04-21 14:55:59.029929271 +0000 UTC m=+5.164906866 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nprfg" (UniqueName: "kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg") pod "network-check-target-zppc2" (UID: "cab0d6b8-24b3-4ee8-bf59-c526df4af70b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:57.386069 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:57.385975 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:50:55 +0000 UTC" deadline="2028-02-02 19:07:40.68176819 +0000 UTC" Apr 21 14:55:57.386069 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:57.386012 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15652h11m43.295759531s" Apr 21 14:55:57.446453 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:57.446418 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:57.446642 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:57.446578 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:55:58.058696 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.058659 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-d7bn9"] Apr 21 14:55:58.060568 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.060544 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.060708 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:58.060625 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:55:58.138896 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.138858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/04bb98f2-3b25-4aa5-aa5b-4484506ce286-kubelet-config\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.139065 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.138955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/04bb98f2-3b25-4aa5-aa5b-4484506ce286-dbus\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.139065 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.138991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.239383 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.239346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/04bb98f2-3b25-4aa5-aa5b-4484506ce286-kubelet-config\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.239550 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.239444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/04bb98f2-3b25-4aa5-aa5b-4484506ce286-dbus\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.239550 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.239483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.239669 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:58.239616 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:58.239730 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:58.239675 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret podName:04bb98f2-3b25-4aa5-aa5b-4484506ce286 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:58.739656273 +0000 UTC m=+4.874633874 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret") pod "global-pull-secret-syncer-d7bn9" (UID: "04bb98f2-3b25-4aa5-aa5b-4484506ce286") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:58.240313 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.239928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/04bb98f2-3b25-4aa5-aa5b-4484506ce286-kubelet-config\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.240313 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.240075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/04bb98f2-3b25-4aa5-aa5b-4484506ce286-dbus\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.448148 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.448071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:55:58.448597 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:58.448197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:55:58.743452 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:58.742829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:58.743452 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:58.742972 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:58.743452 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:58.743029 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret podName:04bb98f2-3b25-4aa5-aa5b-4484506ce286 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:59.743015787 +0000 UTC m=+5.877993393 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret") pod "global-pull-secret-syncer-d7bn9" (UID: "04bb98f2-3b25-4aa5-aa5b-4484506ce286") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:59.046274 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:59.045327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:55:59.046274 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:59.045397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:59.046274 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.045546 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:59.046274 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.045612 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 14:56:03.045592082 +0000 UTC m=+9.180569671 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:59.046274 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.046013 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:59.046274 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.046031 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:59.046274 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.046046 2576 projected.go:194] Error preparing data for projected volume kube-api-access-nprfg for pod openshift-network-diagnostics/network-check-target-zppc2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:59.046274 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.046088 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg podName:cab0d6b8-24b3-4ee8-bf59-c526df4af70b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:03.046073998 +0000 UTC m=+9.181051599 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nprfg" (UniqueName: "kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg") pod "network-check-target-zppc2" (UID: "cab0d6b8-24b3-4ee8-bf59-c526df4af70b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:59.445893 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:59.445808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:59.446065 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.445945 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:55:59.446383 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:59.446364 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:55:59.446492 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.446472 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:55:59.752046 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:55:59.751955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:55:59.752494 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.752096 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:55:59.752494 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:55:59.752157 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret podName:04bb98f2-3b25-4aa5-aa5b-4484506ce286 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:01.752136498 +0000 UTC m=+7.887114121 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret") pod "global-pull-secret-syncer-d7bn9" (UID: "04bb98f2-3b25-4aa5-aa5b-4484506ce286") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:00.448902 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:00.448868 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:00.449092 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:00.448990 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:01.446528 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:01.446490 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:01.447009 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:01.446635 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:01.447097 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:01.447066 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:01.447190 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:01.447168 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:01.769778 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:01.769681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:01.769954 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:01.769862 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:01.769954 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:01.769944 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret podName:04bb98f2-3b25-4aa5-aa5b-4484506ce286 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:05.769923036 +0000 UTC m=+11.904900630 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret") pod "global-pull-secret-syncer-d7bn9" (UID: "04bb98f2-3b25-4aa5-aa5b-4484506ce286") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:02.445954 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:02.445922 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:02.446129 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:02.446051 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:03.082002 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:03.081964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:03.082503 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:03.082026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:03.082503 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:03.082140 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:03.082503 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:03.082159 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:03.082503 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:03.082166 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:03.082503 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:03.082180 2576 projected.go:194] Error preparing data for projected volume kube-api-access-nprfg for pod openshift-network-diagnostics/network-check-target-zppc2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:03.082503 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:03.082231 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 14:56:11.082211217 +0000 UTC m=+17.217188803 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:03.082503 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:03.082271 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg podName:cab0d6b8-24b3-4ee8-bf59-c526df4af70b nodeName:}" failed. 
No retries permitted until 2026-04-21 14:56:11.082259961 +0000 UTC m=+17.217237555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nprfg" (UniqueName: "kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg") pod "network-check-target-zppc2" (UID: "cab0d6b8-24b3-4ee8-bf59-c526df4af70b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:03.446691 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:03.446586 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:03.446861 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:03.446744 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:03.446861 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:03.446823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:03.446972 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:03.446928 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:04.446915 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:04.446825 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:04.447336 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:04.446941 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:05.446764 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:05.446723 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:05.446953 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:05.446723 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:05.446953 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:05.446866 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:05.447279 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:05.446953 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:05.803003 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:05.802912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:05.803161 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:05.803065 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:05.803161 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:05.803137 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret podName:04bb98f2-3b25-4aa5-aa5b-4484506ce286 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:13.803117709 +0000 UTC m=+19.938095319 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret") pod "global-pull-secret-syncer-d7bn9" (UID: "04bb98f2-3b25-4aa5-aa5b-4484506ce286") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:06.446614 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:06.446577 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:06.446799 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:06.446704 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:07.446188 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:07.446149 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:07.446639 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:07.446162 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:07.446639 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:07.446304 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:07.446639 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:07.446380 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:08.445987 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:08.445946 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:08.446153 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:08.446080 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:09.446483 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:09.446447 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:09.446926 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:09.446574 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:09.446926 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:09.446633 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:09.447202 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:09.447153 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:10.446579 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:10.446044 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:10.446579 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:10.446179 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:11.139109 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:11.139063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:11.139353 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:11.139136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:11.139353 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:11.139257 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:11.139353 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:11.139284 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:11.139353 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:11.139296 2576 projected.go:194] Error preparing data for projected volume kube-api-access-nprfg for pod openshift-network-diagnostics/network-check-target-zppc2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:11.139353 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:11.139293 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:11.139545 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:11.139363 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg podName:cab0d6b8-24b3-4ee8-bf59-c526df4af70b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:27.139343304 +0000 UTC m=+33.274320891 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nprfg" (UniqueName: "kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg") pod "network-check-target-zppc2" (UID: "cab0d6b8-24b3-4ee8-bf59-c526df4af70b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:11.139545 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:11.139408 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 14:56:27.139373583 +0000 UTC m=+33.274351172 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:11.446156 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:11.446086 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:11.446319 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:11.446095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:11.446319 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:11.446203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:11.446319 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:11.446295 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:12.446772 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:12.446728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:12.447198 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:12.446870 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:13.445973 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:13.445927 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:13.446156 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:13.445933 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:13.446156 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:13.446055 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:13.446156 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:13.446140 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:13.859346 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:13.859306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:13.859681 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:13.859408 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:13.859681 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:13.859473 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret podName:04bb98f2-3b25-4aa5-aa5b-4484506ce286 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:29.859453758 +0000 UTC m=+35.994431345 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret") pod "global-pull-secret-syncer-d7bn9" (UID: "04bb98f2-3b25-4aa5-aa5b-4484506ce286") : object "kube-system"/"original-pull-secret" not registered Apr 21 14:56:14.447298 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:14.447095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:14.447415 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:14.447376 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:14.595664 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:14.595629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal" event={"ID":"e294e9cc44aca557b7c9191559850248","Type":"ContainerStarted","Data":"db4f67104077e95e1ff86fb2786a93b387b53c872a56f6ff88b6fa39b5f9b11c"} Apr 21 14:56:14.598318 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:14.598294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" event={"ID":"e80a0011-4086-4205-a1ef-92d35adf2717","Type":"ContainerStarted","Data":"05f282a69d03a188e2bc060a9160e477f0a07fd4d7968798245533831435e4a0"} Apr 21 14:56:14.601899 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:14.601877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q2lt" event={"ID":"91d8fda2-6bab-475f-bb2c-09e739e26078","Type":"ContainerStarted","Data":"0301c13fa5b5d71c28abcf14554e1bbb008d8f20cb2965d3757c1bec5ec5c344"} Apr 21 14:56:14.616413 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:14.616355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"2b2a3160f5da2d3a54efaab341c27c5c8cc4de05475e60f9350c0e14e5a26b15"} Apr 21 14:56:14.616514 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:14.616425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"71c08f79b08ae616830df4c3c9a8da3bd0e10be9eec418d372d24d5222ed2fda"} Apr 21 14:56:14.629928 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:14.629866 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-121.ec2.internal" podStartSLOduration=19.629847318 podStartE2EDuration="19.629847318s" podCreationTimestamp="2026-04-21 14:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:56:14.611910836 +0000 UTC m=+20.746888448" watchObservedRunningTime="2026-04-21 14:56:14.629847318 +0000 UTC m=+20.764824932" Apr 21 14:56:14.630193 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:14.630150 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6bjz7" podStartSLOduration=2.561888448 podStartE2EDuration="20.630137709s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.648697606 +0000 UTC m=+1.783675194" lastFinishedPulling="2026-04-21 14:56:13.716946851 +0000 UTC m=+19.851924455" observedRunningTime="2026-04-21 14:56:14.628835039 +0000 UTC m=+20.763812649" watchObservedRunningTime="2026-04-21 14:56:14.630137709 +0000 UTC m=+20.765115319" Apr 21 14:56:15.446931 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.446740 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:15.447694 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.446740 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:15.447694 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:15.447023 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:15.447694 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:15.447142 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:15.619976 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.619940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rw6cl" event={"ID":"ed61183c-f5d7-41e8-9947-fc9a9e12a3da","Type":"ContainerStarted","Data":"6d5a6a4fd8a0477b3a57bebacdbdc7316306ac2eac08121bb7dce3ee51cae9c9"} Apr 21 14:56:15.621423 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.621392 2576 generic.go:358] "Generic (PLEG): container finished" podID="071f55a0-8b84-4aba-97bf-3b7856b4c800" containerID="a2d137b69d623d0d0252c2ecc4ba9eef88a7b508713fbedb57338898f6a7ae8b" exitCode=0 Apr 21 14:56:15.621555 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.621477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sldvt" event={"ID":"071f55a0-8b84-4aba-97bf-3b7856b4c800","Type":"ContainerDied","Data":"a2d137b69d623d0d0252c2ecc4ba9eef88a7b508713fbedb57338898f6a7ae8b"} Apr 21 14:56:15.622822 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.622800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jpxzc" event={"ID":"0c7e2c16-e24b-4449-ae17-3e6e83f0e900","Type":"ContainerStarted","Data":"b6f9e15d1bd9110262458c3c434491f5fd863e50703acac12382960b9e865417"} Apr 21 14:56:15.625692 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.625671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"e76de2d3e3a9dc663a8f609eeeb42265ed339fd484d253fba59c97023ea900b1"} Apr 21 14:56:15.625782 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.625695 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"b4a026df323b342b73f6f763dbee420625da259354e3e4392e192b16fd87d4ed"} Apr 21 14:56:15.625782 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.625707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"e1b585f389a208df437e0cf7a757c3808d90175006da061eccbed451b0a6f60e"} Apr 21 14:56:15.625782 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.625720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" 
event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"90477e6a595965d0b5e42085508b7e0620af2fa651a6238adfe39f143cc5f1c4"} Apr 21 14:56:15.626988 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.626968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kpdvd" event={"ID":"dadbd785-1d07-45b6-868c-c95e20421c54","Type":"ContainerStarted","Data":"30defceff83dfd21a7aa82852402e763578c5b7bd2c8ea09f22d70e2e78d7238"} Apr 21 14:56:15.628332 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.628301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" event={"ID":"e0c80303-a3d0-4f47-8d58-c49be590b002","Type":"ContainerStarted","Data":"32679a07e8d2799df38a2eabfd84774ea88d383650758fc03d1edd57b7ea89c9"} Apr 21 14:56:15.629735 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.629711 2576 generic.go:358] "Generic (PLEG): container finished" podID="16ec0454ffec8b367ca76459ebc7c451" containerID="c21d4c1448519a2ed1dd9664d335808654e0e03bbead5c8e84ff637b537f8461" exitCode=0 Apr 21 14:56:15.629828 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.629777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" event={"ID":"16ec0454ffec8b367ca76459ebc7c451","Type":"ContainerDied","Data":"c21d4c1448519a2ed1dd9664d335808654e0e03bbead5c8e84ff637b537f8461"} Apr 21 14:56:15.631178 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.631142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7h2f9" event={"ID":"297ac21d-4aa7-488f-8f40-48d7b969036b","Type":"ContainerStarted","Data":"92517f338cae2a61586857d41d7c9c6985cca596fc4d650ec7bf1b2247be867d"} Apr 21 14:56:15.634198 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.634132 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5q2lt" podStartSLOduration=3.63787021 podStartE2EDuration="21.634117918s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.740289258 +0000 UTC m=+1.875266845" lastFinishedPulling="2026-04-21 14:56:13.736536964 +0000 UTC m=+19.871514553" observedRunningTime="2026-04-21 14:56:14.649298892 +0000 UTC m=+20.784276502" watchObservedRunningTime="2026-04-21 14:56:15.634117918 +0000 UTC m=+21.769095527" Apr 21 14:56:15.634512 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.634480 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rw6cl" podStartSLOduration=3.658316011 podStartE2EDuration="21.63447381s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.744715339 +0000 UTC m=+1.879692925" lastFinishedPulling="2026-04-21 14:56:13.72087313 +0000 UTC m=+19.855850724" observedRunningTime="2026-04-21 14:56:15.634261776 +0000 UTC m=+21.769239367" watchObservedRunningTime="2026-04-21 14:56:15.63447381 +0000 UTC m=+21.769451420" Apr 21 14:56:15.647857 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.647814 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jpxzc" podStartSLOduration=3.655839011 podStartE2EDuration="21.647800173s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.693628313 +0000 UTC m=+1.828605900" lastFinishedPulling="2026-04-21 14:56:13.68558946 +0000 UTC 
m=+19.820567062" observedRunningTime="2026-04-21 14:56:15.647437013 +0000 UTC m=+21.782414622" watchObservedRunningTime="2026-04-21 14:56:15.647800173 +0000 UTC m=+21.782777816" Apr 21 14:56:15.676715 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.676598 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kpdvd" podStartSLOduration=3.654080043 podStartE2EDuration="21.676581319s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.66275982 +0000 UTC m=+1.797737407" lastFinishedPulling="2026-04-21 14:56:13.685261081 +0000 UTC m=+19.820238683" observedRunningTime="2026-04-21 14:56:15.676406388 +0000 UTC m=+21.811383996" watchObservedRunningTime="2026-04-21 14:56:15.676581319 +0000 UTC m=+21.811558929" Apr 21 14:56:15.691147 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.691074 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7h2f9" podStartSLOduration=3.745618853 podStartE2EDuration="21.691061306s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.739855751 +0000 UTC m=+1.874833352" lastFinishedPulling="2026-04-21 14:56:13.685298214 +0000 UTC m=+19.820275805" observedRunningTime="2026-04-21 14:56:15.690646955 +0000 UTC m=+21.825624566" watchObservedRunningTime="2026-04-21 14:56:15.691061306 +0000 UTC m=+21.826038915" Apr 21 14:56:15.941354 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:15.941329 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 14:56:16.369707 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:16.369494 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T14:56:15.94135147Z","UUID":"7fe06359-519e-42ef-ae74-0ce80b71fc0b","Handler":null,"Name":"","Endpoint":""} Apr 21 14:56:16.372111 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:16.372085 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 14:56:16.372311 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:16.372121 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 14:56:16.446462 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:16.446430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:16.446630 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:16.446555 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:16.634747 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:16.634647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" event={"ID":"e0c80303-a3d0-4f47-8d58-c49be590b002","Type":"ContainerStarted","Data":"4f792a57feed8eba2c9e29b8cb0ed86b725f0738bab383cd235bec3c19105ab2"} Apr 21 14:56:16.639145 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:16.639111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" event={"ID":"16ec0454ffec8b367ca76459ebc7c451","Type":"ContainerStarted","Data":"7e48ab8079b64698595d6aa6780b7e1ae6a9fca8bf4733c123d8f0fecbe89111"} Apr 21 14:56:16.661418 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:16.661372 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-121.ec2.internal" podStartSLOduration=21.661357321 podStartE2EDuration="21.661357321s" podCreationTimestamp="2026-04-21 14:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:56:16.660794836 +0000 UTC m=+22.795772446" watchObservedRunningTime="2026-04-21 14:56:16.661357321 +0000 UTC m=+22.796334930" Apr 21 14:56:17.446642 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:17.446400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:17.446816 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:17.446400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:17.446816 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:17.446690 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:17.446902 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:17.446815 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:17.642951 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:17.642915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"048184cd9cd3cf865e63640983781012315d1f1dfc9757dae94dae26cbfd3c77"} Apr 21 14:56:17.645321 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:17.644793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" event={"ID":"e0c80303-a3d0-4f47-8d58-c49be590b002","Type":"ContainerStarted","Data":"69ce5277e4fa63266b8922c6ed74c01740fe09fe2b9a679cae613e82665212d6"} Apr 21 14:56:17.661027 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:17.660978 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kdjf" podStartSLOduration=2.5601087270000003 podStartE2EDuration="23.660963981s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.624275758 +0000 UTC m=+1.759253344" lastFinishedPulling="2026-04-21 14:56:16.725130997 +0000 UTC m=+22.860108598" observedRunningTime="2026-04-21 14:56:17.660506325 +0000 UTC m=+23.795483935" watchObservedRunningTime="2026-04-21 14:56:17.660963981 +0000 UTC m=+23.795941589" Apr 21 14:56:18.384787 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:18.384751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:56:18.385417 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:18.385383 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:56:18.445949 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:18.445911 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:18.446119 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:18.446050 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:18.647034 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:18.646956 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:56:18.647607 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:18.647405 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jpxzc" Apr 21 14:56:19.446306 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:19.446210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:19.446473 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:19.446210 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:19.446473 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:19.446350 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:19.446473 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:19.446424 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:20.446587 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:20.446387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:20.447264 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:20.446660 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:20.652662 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:20.652617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-th598" event={"ID":"bd6e163c-77e0-40c8-8dc0-b521635bd265","Type":"ContainerStarted","Data":"5f3048cb1bec9c6a676e6071282429dac6c2008d261e30c4e0f9bc56f3478bcb"} Apr 21 14:56:20.652934 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:20.652906 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:56:20.652934 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:20.652941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:56:20.654318 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:20.654296 2576 generic.go:358] "Generic (PLEG): container finished" podID="071f55a0-8b84-4aba-97bf-3b7856b4c800" containerID="e2d0b6a00340027cb1c2df397ee8f0b4a22ec1ca64066fd7bc47d9bd9f7d638d" exitCode=0 Apr 21 14:56:20.654434 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:20.654363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sldvt" event={"ID":"071f55a0-8b84-4aba-97bf-3b7856b4c800","Type":"ContainerDied","Data":"e2d0b6a00340027cb1c2df397ee8f0b4a22ec1ca64066fd7bc47d9bd9f7d638d"} Apr 21 14:56:20.668462 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:20.668443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:56:20.678729 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:20.678694 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-th598" 
podStartSLOduration=7.94323809 podStartE2EDuration="26.678682364s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.677080509 +0000 UTC m=+1.812058113" lastFinishedPulling="2026-04-21 14:56:14.4125248 +0000 UTC m=+20.547502387" observedRunningTime="2026-04-21 14:56:20.677654676 +0000 UTC m=+26.812632282" watchObservedRunningTime="2026-04-21 14:56:20.678682364 +0000 UTC m=+26.813659973" Apr 21 14:56:21.446181 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.446000 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:21.446340 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.446000 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:21.446340 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:21.446281 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:21.446340 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:21.446328 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:21.576566 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.576486 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d7bn9"] Apr 21 14:56:21.579414 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.579382 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zppc2"] Apr 21 14:56:21.579525 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.579511 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:21.579612 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:21.579594 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:21.579957 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.579937 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ktgkr"] Apr 21 14:56:21.658374 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.658340 2576 generic.go:358] "Generic (PLEG): container finished" podID="071f55a0-8b84-4aba-97bf-3b7856b4c800" containerID="43bbd8bddf91a4dacf9b7c602ad775d3002955e516c816a05391a5ac662944ff" exitCode=0 Apr 21 14:56:21.658542 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.658413 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:21.658542 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.658449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sldvt" event={"ID":"071f55a0-8b84-4aba-97bf-3b7856b4c800","Type":"ContainerDied","Data":"43bbd8bddf91a4dacf9b7c602ad775d3002955e516c816a05391a5ac662944ff"} Apr 21 14:56:21.658542 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:21.658502 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:21.658934 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.658910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:56:21.659059 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.658949 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:21.659100 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:21.659062 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:21.674230 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:21.674210 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:56:22.662302 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:22.662212 2576 generic.go:358] "Generic (PLEG): container finished" podID="071f55a0-8b84-4aba-97bf-3b7856b4c800" containerID="2360182772fc09c1e256f8b1f18e51d36919b1c5f01f34a5896b16ce7c42a9bc" exitCode=0 Apr 21 14:56:22.662302 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:22.662280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sldvt" event={"ID":"071f55a0-8b84-4aba-97bf-3b7856b4c800","Type":"ContainerDied","Data":"2360182772fc09c1e256f8b1f18e51d36919b1c5f01f34a5896b16ce7c42a9bc"} Apr 21 14:56:23.446196 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:23.446118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:23.446196 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:23.446145 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:23.446196 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:23.446148 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:23.446456 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:23.446226 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:23.446456 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:23.446343 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:23.446456 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:23.446420 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:25.446042 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:25.445953 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:25.446042 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:25.445953 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:25.446042 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:25.445957 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:25.446739 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:25.446081 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d7bn9" podUID="04bb98f2-3b25-4aa5-aa5b-4484506ce286" Apr 21 14:56:25.446739 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:25.446207 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:56:25.446739 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:25.446292 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zppc2" podUID="cab0d6b8-24b3-4ee8-bf59-c526df4af70b" Apr 21 14:56:27.163734 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.163646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:27.163734 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.163715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:27.164325 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.163773 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:27.164325 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.163798 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:27.164325 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.163810 2576 projected.go:194] Error preparing data for projected volume kube-api-access-nprfg for pod openshift-network-diagnostics/network-check-target-zppc2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:27.164325 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.163818 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:27.164325 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.163873 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg podName:cab0d6b8-24b3-4ee8-bf59-c526df4af70b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:59.163854741 +0000 UTC m=+65.298832352 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nprfg" (UniqueName: "kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg") pod "network-check-target-zppc2" (UID: "cab0d6b8-24b3-4ee8-bf59-c526df4af70b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:27.164325 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.163889 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 14:56:59.163881848 +0000 UTC m=+65.298859435 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:27.220986 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.220953 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-121.ec2.internal" event="NodeReady" Apr 21 14:56:27.221157 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.221111 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 14:56:27.255831 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.255798 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5df9bb698f-sqrff"] Apr 21 14:56:27.277725 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.277695 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vbmd9"] Apr 21 14:56:27.277887 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.277825 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.279912 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.279885 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 14:56:27.280031 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.279916 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-47f52\"" Apr 21 14:56:27.280031 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.279890 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 14:56:27.280271 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.280228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 14:56:27.293705 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.293680 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 14:56:27.294017 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.293998 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-s7jz7"] Apr 21 14:56:27.294175 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.294146 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.296405 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.296003 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 14:56:27.296405 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.296224 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 14:56:27.296405 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.296308 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5sqkw\"" Apr 21 14:56:27.311810 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.311786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5df9bb698f-sqrff"] Apr 21 14:56:27.311810 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.311813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vbmd9"] Apr 21 14:56:27.311979 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.311822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s7jz7"] Apr 21 14:56:27.311979 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.311914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:27.313885 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.313866 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 14:56:27.313996 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.313883 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 14:56:27.313996 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.313907 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 14:56:27.313996 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.313979 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7xfsn\"" Apr 21 14:56:27.364479 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhh4c\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-kube-api-access-jhh4c\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.364479 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.364737 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9033c602-82a6-4c53-b7df-ea48a7be061b-ca-trust-extracted\") pod \"image-registry-5df9bb698f-sqrff\" (UID: 
\"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.364737 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vl99\" (UniqueName: \"kubernetes.io/projected/1ada8e2a-356e-4899-913a-b055b92852e4-kube-api-access-6vl99\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.364737 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-trusted-ca\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.364737 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364680 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-installation-pull-secrets\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.364737 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.364737 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-certificates\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.364968 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ada8e2a-356e-4899-913a-b055b92852e4-tmp-dir\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.364968 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-image-registry-private-configuration\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.364968 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ada8e2a-356e-4899-913a-b055b92852e4-config-volume\") pod \"dns-default-vbmd9\" (UID: 
\"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.364968 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.364879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-bound-sa-token\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.446075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.445994 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:27.446257 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.446005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:27.446257 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.445999 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:27.448433 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.448407 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 14:56:27.448547 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.448509 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 14:56:27.448547 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.448539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 14:56:27.448673 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.448643 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cb4hj\"" Apr 21 14:56:27.448839 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.448823 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 14:56:27.448914 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.448864 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sdz22\"" Apr 21 14:56:27.466040 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-trusted-ca\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.466171 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-installation-pull-secrets\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.466171 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.466171 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-certificates\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.466386 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.466200 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:27.466386 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.466219 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5df9bb698f-sqrff: secret "image-registry-tls" not found Apr 21 14:56:27.466386 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.466297 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls podName:9033c602-82a6-4c53-b7df-ea48a7be061b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:27.966275617 +0000 UTC m=+34.101253218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls") pod "image-registry-5df9bb698f-sqrff" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b") : secret "image-registry-tls" not found Apr 21 14:56:27.466386 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp24w\" (UniqueName: \"kubernetes.io/projected/2ee3aa35-2266-4471-8170-7e506d7cd358-kube-api-access-zp24w\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:27.466612 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ada8e2a-356e-4899-913a-b055b92852e4-tmp-dir\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.466612 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-image-registry-private-configuration\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.466612 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ada8e2a-356e-4899-913a-b055b92852e4-config-volume\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.466612 ip-10-0-130-121 kubenswrapper[2576]: 
I0421 14:56:27.466574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:27.466839 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-bound-sa-token\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.466839 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhh4c\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-kube-api-access-jhh4c\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.466839 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.466839 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9033c602-82a6-4c53-b7df-ea48a7be061b-ca-trust-extracted\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.466839 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vl99\" (UniqueName: \"kubernetes.io/projected/1ada8e2a-356e-4899-913a-b055b92852e4-kube-api-access-6vl99\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.466839 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-certificates\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.467101 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.466856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ada8e2a-356e-4899-913a-b055b92852e4-tmp-dir\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.467101 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.466942 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:27.467101 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.466997 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls podName:1ada8e2a-356e-4899-913a-b055b92852e4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:27.966980518 +0000 UTC m=+34.101958104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls") pod "dns-default-vbmd9" (UID: "1ada8e2a-356e-4899-913a-b055b92852e4") : secret "dns-default-metrics-tls" not found Apr 21 14:56:27.467101 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.467048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-trusted-ca\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.467446 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.467180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9033c602-82a6-4c53-b7df-ea48a7be061b-ca-trust-extracted\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.467446 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.467382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ada8e2a-356e-4899-913a-b055b92852e4-config-volume\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.471701 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.471568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-installation-pull-secrets\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.472939 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.472920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-image-registry-private-configuration\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.476182 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.476161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vl99\" (UniqueName: \"kubernetes.io/projected/1ada8e2a-356e-4899-913a-b055b92852e4-kube-api-access-6vl99\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.482023 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.481998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhh4c\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-kube-api-access-jhh4c\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.482117 
ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.482039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-bound-sa-token\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.567254 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.567192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zp24w\" (UniqueName: \"kubernetes.io/projected/2ee3aa35-2266-4471-8170-7e506d7cd358-kube-api-access-zp24w\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:27.567503 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.567474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:27.567659 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.567641 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:27.567764 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.567726 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert podName:2ee3aa35-2266-4471-8170-7e506d7cd358 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:28.067705076 +0000 UTC m=+34.202682666 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert") pod "ingress-canary-s7jz7" (UID: "2ee3aa35-2266-4471-8170-7e506d7cd358") : secret "canary-serving-cert" not found Apr 21 14:56:27.578914 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.578880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp24w\" (UniqueName: \"kubernetes.io/projected/2ee3aa35-2266-4471-8170-7e506d7cd358-kube-api-access-zp24w\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:27.971157 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.971114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:27.971366 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:27.971202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:27.971366 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.971296 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:27.971366 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.971333 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:27.971366 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.971347 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5df9bb698f-sqrff: secret "image-registry-tls" not found Apr 21 14:56:27.971513 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.971378 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls podName:1ada8e2a-356e-4899-913a-b055b92852e4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:28.971359692 +0000 UTC m=+35.106337292 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls") pod "dns-default-vbmd9" (UID: "1ada8e2a-356e-4899-913a-b055b92852e4") : secret "dns-default-metrics-tls" not found Apr 21 14:56:27.971513 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:27.971404 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls podName:9033c602-82a6-4c53-b7df-ea48a7be061b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:28.97138829 +0000 UTC m=+35.106365887 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls") pod "image-registry-5df9bb698f-sqrff" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b") : secret "image-registry-tls" not found Apr 21 14:56:28.072187 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:28.072146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:28.072388 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:28.072308 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:28.072388 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:28.072367 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert podName:2ee3aa35-2266-4471-8170-7e506d7cd358 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:29.072352843 +0000 UTC m=+35.207330430 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert") pod "ingress-canary-s7jz7" (UID: "2ee3aa35-2266-4471-8170-7e506d7cd358") : secret "canary-serving-cert" not found Apr 21 14:56:28.980193 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:28.980153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:28.980638 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:28.980219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:28.980638 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:28.980344 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:28.980638 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:28.980358 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5df9bb698f-sqrff: secret "image-registry-tls" not found Apr 21 14:56:28.980638 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:28.980361 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:28.980638 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:28.980421 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls podName:9033c602-82a6-4c53-b7df-ea48a7be061b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:30.980401068 +0000 UTC m=+37.115378845 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls") pod "image-registry-5df9bb698f-sqrff" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b") : secret "image-registry-tls" not found Apr 21 14:56:28.980638 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:28.980436 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls podName:1ada8e2a-356e-4899-913a-b055b92852e4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:30.980430438 +0000 UTC m=+37.115408026 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls") pod "dns-default-vbmd9" (UID: "1ada8e2a-356e-4899-913a-b055b92852e4") : secret "dns-default-metrics-tls" not found Apr 21 14:56:29.080937 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:29.080893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:29.081085 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:29.081033 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:29.081125 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:29.081097 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert podName:2ee3aa35-2266-4471-8170-7e506d7cd358 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:31.081081773 +0000 UTC m=+37.216059360 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert") pod "ingress-canary-s7jz7" (UID: "2ee3aa35-2266-4471-8170-7e506d7cd358") : secret "canary-serving-cert" not found Apr 21 14:56:29.678939 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:29.678909 2576 generic.go:358] "Generic (PLEG): container finished" podID="071f55a0-8b84-4aba-97bf-3b7856b4c800" containerID="0f5866da888088d44384f5dfbcc4562001f588edeb2cae9ca9f0333256025c18" exitCode=0 Apr 21 14:56:29.679096 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:29.678964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sldvt" event={"ID":"071f55a0-8b84-4aba-97bf-3b7856b4c800","Type":"ContainerDied","Data":"0f5866da888088d44384f5dfbcc4562001f588edeb2cae9ca9f0333256025c18"} Apr 21 14:56:29.887875 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:29.887844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:29.899889 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:29.899854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04bb98f2-3b25-4aa5-aa5b-4484506ce286-original-pull-secret\") pod \"global-pull-secret-syncer-d7bn9\" (UID: \"04bb98f2-3b25-4aa5-aa5b-4484506ce286\") " pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:30.157929 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:30.157882 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d7bn9" Apr 21 14:56:30.298669 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:30.298639 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d7bn9"] Apr 21 14:56:30.302550 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:56:30.302510 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04bb98f2_3b25_4aa5_aa5b_4484506ce286.slice/crio-8c646264060cf8bc659999fcb3a097fccfa8809de3fd867843f4860d05a74949 WatchSource:0}: Error finding container 8c646264060cf8bc659999fcb3a097fccfa8809de3fd867843f4860d05a74949: Status 404 returned error can't find the container with id 8c646264060cf8bc659999fcb3a097fccfa8809de3fd867843f4860d05a74949 Apr 21 14:56:30.683073 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:30.683042 2576 generic.go:358] "Generic (PLEG): container finished" podID="071f55a0-8b84-4aba-97bf-3b7856b4c800" containerID="6dea2a79b49502f26d9aaee5756921c088a41a7ce898a7e00911220ae119fa32" exitCode=0 Apr 21 14:56:30.683270 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:30.683130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sldvt" event={"ID":"071f55a0-8b84-4aba-97bf-3b7856b4c800","Type":"ContainerDied","Data":"6dea2a79b49502f26d9aaee5756921c088a41a7ce898a7e00911220ae119fa32"} Apr 21 14:56:30.684129 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:30.684045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d7bn9" event={"ID":"04bb98f2-3b25-4aa5-aa5b-4484506ce286","Type":"ContainerStarted","Data":"8c646264060cf8bc659999fcb3a097fccfa8809de3fd867843f4860d05a74949"} Apr 21 14:56:30.997692 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:30.997594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:30.997853 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:30.997693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:30.997853 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:30.997706 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:30.997853 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:30.997777 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls podName:1ada8e2a-356e-4899-913a-b055b92852e4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:34.99775576 +0000 UTC m=+41.132733347 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls") pod "dns-default-vbmd9" (UID: "1ada8e2a-356e-4899-913a-b055b92852e4") : secret "dns-default-metrics-tls" not found Apr 21 14:56:30.997853 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:30.997805 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:30.997853 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:30.997820 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5df9bb698f-sqrff: secret "image-registry-tls" not found Apr 21 14:56:30.998078 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:30.997875 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls podName:9033c602-82a6-4c53-b7df-ea48a7be061b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:34.997859665 +0000 UTC m=+41.132837257 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls") pod "image-registry-5df9bb698f-sqrff" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b") : secret "image-registry-tls" not found Apr 21 14:56:31.098645 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:31.098606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:31.098828 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:31.098734 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:31.098828 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:31.098802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert podName:2ee3aa35-2266-4471-8170-7e506d7cd358 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:35.098784523 +0000 UTC m=+41.233762135 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert") pod "ingress-canary-s7jz7" (UID: "2ee3aa35-2266-4471-8170-7e506d7cd358") : secret "canary-serving-cert" not found Apr 21 14:56:31.689555 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:31.689520 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sldvt" event={"ID":"071f55a0-8b84-4aba-97bf-3b7856b4c800","Type":"ContainerStarted","Data":"a46513d4251acce9134e43a3a058076773d27b0f7783fc518ac0eac3577de9ae"} Apr 21 14:56:31.711427 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:31.710387 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sldvt" podStartSLOduration=4.706788968 podStartE2EDuration="37.710367432s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:55:55.738142245 +0000 UTC m=+1.873119833" lastFinishedPulling="2026-04-21 14:56:28.741720701 +0000 UTC m=+34.876698297" observedRunningTime="2026-04-21 14:56:31.709734832 +0000 UTC m=+37.844712441" watchObservedRunningTime="2026-04-21 14:56:31.710367432 +0000 UTC m=+37.845345053" Apr 21 14:56:34.697723 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:34.697638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d7bn9" event={"ID":"04bb98f2-3b25-4aa5-aa5b-4484506ce286","Type":"ContainerStarted","Data":"c704c9bab3b7825bc79dbc504b504308dd1cac1dc7d37f1b0ee45f0dfb55af11"} Apr 21 14:56:35.031845 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:35.031814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:35.031997 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:35.031909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:35.031997 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:35.031970 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:35.031997 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:35.031989 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5df9bb698f-sqrff: secret "image-registry-tls" not found Apr 21 14:56:35.032096 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:35.032003 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:35.032096 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:35.032053 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls podName:9033c602-82a6-4c53-b7df-ea48a7be061b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:43.032033589 +0000 UTC m=+49.167011231 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls") pod "image-registry-5df9bb698f-sqrff" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b") : secret "image-registry-tls" not found Apr 21 14:56:35.032096 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:35.032068 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls podName:1ada8e2a-356e-4899-913a-b055b92852e4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:43.032060904 +0000 UTC m=+49.167038492 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls") pod "dns-default-vbmd9" (UID: "1ada8e2a-356e-4899-913a-b055b92852e4") : secret "dns-default-metrics-tls" not found Apr 21 14:56:35.132451 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:35.132413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:35.132615 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:35.132527 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:35.132615 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:35.132578 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert podName:2ee3aa35-2266-4471-8170-7e506d7cd358 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:43.13256353 +0000 UTC m=+49.267541131 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert") pod "ingress-canary-s7jz7" (UID: "2ee3aa35-2266-4471-8170-7e506d7cd358") : secret "canary-serving-cert" not found Apr 21 14:56:43.092498 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:43.092459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:43.092910 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:43.092532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:43.092910 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:43.092626 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:43.092910 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:43.092635 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:43.092910 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:43.092656 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5df9bb698f-sqrff: secret "image-registry-tls" not found Apr 21 14:56:43.092910 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:43.092689 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls podName:1ada8e2a-356e-4899-913a-b055b92852e4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:59.092674261 +0000 UTC m=+65.227651848 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls") pod "dns-default-vbmd9" (UID: "1ada8e2a-356e-4899-913a-b055b92852e4") : secret "dns-default-metrics-tls" not found Apr 21 14:56:43.092910 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:43.092708 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls podName:9033c602-82a6-4c53-b7df-ea48a7be061b nodeName:}" failed. No retries permitted until 2026-04-21 14:56:59.092693932 +0000 UTC m=+65.227671520 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls") pod "image-registry-5df9bb698f-sqrff" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b") : secret "image-registry-tls" not found Apr 21 14:56:43.193370 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:43.193335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:43.193507 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:43.193446 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:43.193507 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:43.193497 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert podName:2ee3aa35-2266-4471-8170-7e506d7cd358 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:59.193482535 +0000 UTC m=+65.328460122 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert") pod "ingress-canary-s7jz7" (UID: "2ee3aa35-2266-4471-8170-7e506d7cd358") : secret "canary-serving-cert" not found Apr 21 14:56:53.675059 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:53.675033 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-th598" Apr 21 14:56:53.700609 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:53.700566 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-d7bn9" podStartSLOduration=51.719316524 podStartE2EDuration="55.700552344s" podCreationTimestamp="2026-04-21 14:55:58 +0000 UTC" firstStartedPulling="2026-04-21 14:56:30.304216803 +0000 UTC m=+36.439194390" lastFinishedPulling="2026-04-21 14:56:34.285452608 +0000 UTC m=+40.420430210" observedRunningTime="2026-04-21 14:56:34.712044682 +0000 UTC m=+40.847022291" watchObservedRunningTime="2026-04-21 14:56:53.700552344 +0000 UTC m=+59.835529955" Apr 21 14:56:55.863296 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.863262 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b"] Apr 21 14:56:55.866866 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.866847 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.869469 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.869447 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 14:56:55.869581 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.869501 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 14:56:55.869581 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.869514 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 14:56:55.869581 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.869504 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 14:56:55.873444 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.873423 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b"] Apr 21 14:56:55.881505 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.881475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5tv9\" (UniqueName: \"kubernetes.io/projected/8788c833-0d83-4777-88b4-c2a44875fcb1-kube-api-access-w5tv9\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.881600 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.881548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8788c833-0d83-4777-88b4-c2a44875fcb1-tmp\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.881659 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.881603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8788c833-0d83-4777-88b4-c2a44875fcb1-klusterlet-config\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.982425 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.982391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8788c833-0d83-4777-88b4-c2a44875fcb1-tmp\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.982604 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.982475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8788c833-0d83-4777-88b4-c2a44875fcb1-klusterlet-config\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.982604 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.982522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5tv9\" (UniqueName: \"kubernetes.io/projected/8788c833-0d83-4777-88b4-c2a44875fcb1-kube-api-access-w5tv9\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.982916 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.982879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8788c833-0d83-4777-88b4-c2a44875fcb1-tmp\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.985982 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.985957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8788c833-0d83-4777-88b4-c2a44875fcb1-klusterlet-config\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:55.991336 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:55.991310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5tv9\" (UniqueName: \"kubernetes.io/projected/8788c833-0d83-4777-88b4-c2a44875fcb1-kube-api-access-w5tv9\") pod \"klusterlet-addon-workmgr-85747f4d65-p4w4b\" (UID: \"8788c833-0d83-4777-88b4-c2a44875fcb1\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:56.176644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:56.176544 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:56:56.304332 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:56.304299 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b"] Apr 21 14:56:56.307732 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:56:56.307701 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8788c833_0d83_4777_88b4_c2a44875fcb1.slice/crio-1cf368405c46ab57e826dc1fc74f3a1f56365256d037e5bc48dd06d33d07f423 WatchSource:0}: Error finding container 1cf368405c46ab57e826dc1fc74f3a1f56365256d037e5bc48dd06d33d07f423: Status 404 returned error can't find the container with id 1cf368405c46ab57e826dc1fc74f3a1f56365256d037e5bc48dd06d33d07f423 Apr 21 14:56:56.740262 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:56.740201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" event={"ID":"8788c833-0d83-4777-88b4-c2a44875fcb1","Type":"ContainerStarted","Data":"1cf368405c46ab57e826dc1fc74f3a1f56365256d037e5bc48dd06d33d07f423"} Apr 21 14:56:59.105159 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.105114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:56:59.105646 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.105187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:56:59.105646 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.105300 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:59.105646 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.105339 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:56:59.105646 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.105356 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5df9bb698f-sqrff: secret "image-registry-tls" not found Apr 21 14:56:59.105646 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.105379 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls podName:1ada8e2a-356e-4899-913a-b055b92852e4 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:31.105361658 +0000 UTC m=+97.240339258 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls") pod "dns-default-vbmd9" (UID: "1ada8e2a-356e-4899-913a-b055b92852e4") : secret "dns-default-metrics-tls" not found Apr 21 14:56:59.105646 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.105399 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls podName:9033c602-82a6-4c53-b7df-ea48a7be061b nodeName:}" failed. No retries permitted until 2026-04-21 14:57:31.105386492 +0000 UTC m=+97.240364083 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls") pod "image-registry-5df9bb698f-sqrff" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b") : secret "image-registry-tls" not found Apr 21 14:56:59.206352 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.206306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:59.206543 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.206407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:56:59.206543 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.206440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:56:59.206668 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.206556 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:59.206668 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.206625 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert podName:2ee3aa35-2266-4471-8170-7e506d7cd358 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:31.2066028 +0000 UTC m=+97.341580430 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert") pod "ingress-canary-s7jz7" (UID: "2ee3aa35-2266-4471-8170-7e506d7cd358") : secret "canary-serving-cert" not found Apr 21 14:56:59.208542 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.208516 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 14:56:59.208718 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.208692 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 14:56:59.217068 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.216992 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 14:56:59.217068 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:56:59.217068 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 14:58:03.217042816 +0000 UTC m=+129.352020405 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : secret "metrics-daemon-secret" not found Apr 21 14:56:59.220155 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.220129 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 14:56:59.230045 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.230020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nprfg\" (UniqueName: \"kubernetes.io/projected/cab0d6b8-24b3-4ee8-bf59-c526df4af70b-kube-api-access-nprfg\") pod \"network-check-target-zppc2\" (UID: \"cab0d6b8-24b3-4ee8-bf59-c526df4af70b\") " pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:56:59.272337 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.272303 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sdz22\"" Apr 21 14:56:59.280647 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:56:59.280623 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:57:00.276271 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:00.276224 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zppc2"] Apr 21 14:57:00.279880 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:57:00.279855 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcab0d6b8_24b3_4ee8_bf59_c526df4af70b.slice/crio-1562c9327edc6b279117d7ac60b9f43b005a6d0caad569ea734885e6bd508a37 WatchSource:0}: Error finding container 1562c9327edc6b279117d7ac60b9f43b005a6d0caad569ea734885e6bd508a37: Status 404 returned error can't find the container with id 1562c9327edc6b279117d7ac60b9f43b005a6d0caad569ea734885e6bd508a37 Apr 21 14:57:00.750817 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:00.750769 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" event={"ID":"8788c833-0d83-4777-88b4-c2a44875fcb1","Type":"ContainerStarted","Data":"24032856029b3a4019e76a567506a6ea1994887216ff63d099674db2a7c6282e"} Apr 21 14:57:00.751079 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:00.751052 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:57:00.751969 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:00.751940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zppc2" event={"ID":"cab0d6b8-24b3-4ee8-bf59-c526df4af70b","Type":"ContainerStarted","Data":"1562c9327edc6b279117d7ac60b9f43b005a6d0caad569ea734885e6bd508a37"} Apr 21 14:57:00.752706 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:00.752664 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" Apr 21 14:57:00.766046 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:00.765993 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85747f4d65-p4w4b" podStartSLOduration=1.491003751 podStartE2EDuration="5.765978236s" podCreationTimestamp="2026-04-21 14:56:55 +0000 UTC" firstStartedPulling="2026-04-21 14:56:56.309447454 +0000 UTC m=+62.444425041" lastFinishedPulling="2026-04-21 14:57:00.584421923 +0000 UTC m=+66.719399526" observedRunningTime="2026-04-21 14:57:00.765009192 +0000 UTC m=+66.899986814" watchObservedRunningTime="2026-04-21 14:57:00.765978236 +0000 UTC m=+66.900955846" Apr 21 14:57:03.760636 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:03.760602 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zppc2" event={"ID":"cab0d6b8-24b3-4ee8-bf59-c526df4af70b","Type":"ContainerStarted","Data":"3e67751ffba21ec7ed9bc6c1fd27d587ab6219b3177bec898a19ae7bfe1d5e0c"} Apr 21 14:57:03.761045 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:03.760742 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:57:03.775907 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:03.775855 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zppc2" 
podStartSLOduration=67.099516518 podStartE2EDuration="1m9.775841205s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 14:57:00.282154287 +0000 UTC m=+66.417131874" lastFinishedPulling="2026-04-21 14:57:02.958478973 +0000 UTC m=+69.093456561" observedRunningTime="2026-04-21 14:57:03.775328266 +0000 UTC m=+69.910305873" watchObservedRunningTime="2026-04-21 14:57:03.775841205 +0000 UTC m=+69.910818813" Apr 21 14:57:31.140506 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:31.140475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:57:31.140835 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:31.140522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") pod \"image-registry-5df9bb698f-sqrff\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:57:31.140835 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:31.140616 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 14:57:31.140835 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:31.140618 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:57:31.140835 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:31.140626 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5df9bb698f-sqrff: secret "image-registry-tls" not found Apr 21 14:57:31.140835 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:31.140717 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls podName:1ada8e2a-356e-4899-913a-b055b92852e4 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:35.140697121 +0000 UTC m=+161.275674709 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls") pod "dns-default-vbmd9" (UID: "1ada8e2a-356e-4899-913a-b055b92852e4") : secret "dns-default-metrics-tls" not found Apr 21 14:57:31.140835 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:31.140745 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls podName:9033c602-82a6-4c53-b7df-ea48a7be061b nodeName:}" failed. No retries permitted until 2026-04-21 14:58:35.140734092 +0000 UTC m=+161.275711680 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls") pod "image-registry-5df9bb698f-sqrff" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b") : secret "image-registry-tls" not found Apr 21 14:57:31.241122 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:31.241079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:57:31.241287 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:31.241223 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:57:31.241332 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:31.241301 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert podName:2ee3aa35-2266-4471-8170-7e506d7cd358 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:35.241284225 +0000 UTC m=+161.376261812 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert") pod "ingress-canary-s7jz7" (UID: "2ee3aa35-2266-4471-8170-7e506d7cd358") : secret "canary-serving-cert" not found Apr 21 14:57:34.765016 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:34.764970 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zppc2" Apr 21 14:57:49.161823 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:49.161793 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kpdvd_dadbd785-1d07-45b6-868c-c95e20421c54/dns-node-resolver/0.log" Apr 21 14:57:49.961354 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:49.961327 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7h2f9_297ac21d-4aa7-488f-8f40-48d7b969036b/node-ca/0.log" Apr 21 14:57:53.997697 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:53.997659 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p"] Apr 21 14:57:54.000443 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.000427 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.003696 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.003671 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:57:54.003925 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.003907 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 14:57:54.004168 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.004147 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 14:57:54.004282 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.004150 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-4w9p2\"" Apr 21 14:57:54.004282 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.004154 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 14:57:54.017541 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.017516 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p"] Apr 21 14:57:54.094967 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.094937 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc"] Apr 21 14:57:54.097786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.097771 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s"] Apr 21 14:57:54.097943 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.097928 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.099701 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.099680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 14:57:54.099898 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.099881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:57:54.099948 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.099914 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-dmzg2\"" Apr 21 14:57:54.099948 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.099932 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 14:57:54.100233 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.100218 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 14:57:54.100703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.100688 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:54.102817 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.102799 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-d9278\"" Apr 21 14:57:54.103045 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.103030 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 14:57:54.103136 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.103073 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:57:54.103196 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.103136 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 14:57:54.108446 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.108425 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s"] Apr 21 14:57:54.109136 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.109117 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc"] Apr 21 14:57:54.113773 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.113754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba080c2-51f9-47d4-8e01-bccc93ada876-config\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.113868 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.113779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnx6\" (UniqueName: \"kubernetes.io/projected/fba080c2-51f9-47d4-8e01-bccc93ada876-kube-api-access-fnnx6\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.113868 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.113802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba080c2-51f9-47d4-8e01-bccc93ada876-serving-cert\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.214740 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.214693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:54.214740 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.214734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.215005 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.214861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba080c2-51f9-47d4-8e01-bccc93ada876-config\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.215005 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.214895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnx6\" (UniqueName: \"kubernetes.io/projected/fba080c2-51f9-47d4-8e01-bccc93ada876-kube-api-access-fnnx6\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.215005 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.214924 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mr4f\" (UniqueName: \"kubernetes.io/projected/85b14099-ff6f-42c5-a9e8-38d1e3153f05-kube-api-access-2mr4f\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:54.215005 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.214950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml5xf\" (UniqueName: \"kubernetes.io/projected/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-kube-api-access-ml5xf\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.215005 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.214983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba080c2-51f9-47d4-8e01-bccc93ada876-serving-cert\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.215270 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.215009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.215391 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.215374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba080c2-51f9-47d4-8e01-bccc93ada876-config\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.217394 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.217369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba080c2-51f9-47d4-8e01-bccc93ada876-serving-cert\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.223287 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.223266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnnx6\" (UniqueName: \"kubernetes.io/projected/fba080c2-51f9-47d4-8e01-bccc93ada876-kube-api-access-fnnx6\") pod \"service-ca-operator-d6fc45fc5-l956p\" (UID: \"fba080c2-51f9-47d4-8e01-bccc93ada876\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.310964 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.310935 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-4w9p2\"" Apr 21 14:57:54.315758 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.315741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mr4f\" (UniqueName: \"kubernetes.io/projected/85b14099-ff6f-42c5-a9e8-38d1e3153f05-kube-api-access-2mr4f\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:54.315826 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.315767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml5xf\" (UniqueName: \"kubernetes.io/projected/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-kube-api-access-ml5xf\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.315826 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.315787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.315963 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.315943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:54.316006 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.315978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.317801 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.317775 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 14:57:54.317801 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.317792 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 14:57:54.317956 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.317842 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 14:57:54.318889 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.318873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" Apr 21 14:57:54.322848 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.322830 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 14:57:54.322939 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.322854 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 14:57:54.326210 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:54.326193 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 14:57:54.326323 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:54.326274 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls podName:85b14099-ff6f-42c5-a9e8-38d1e3153f05 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:54.826251701 +0000 UTC m=+120.961229305 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wvf6s" (UID: "85b14099-ff6f-42c5-a9e8-38d1e3153f05") : secret "samples-operator-tls" not found Apr 21 14:57:54.326846 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.326826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.332905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.328514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.334878 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.334855 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:57:54.334993 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.334877 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:57:54.346446 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.346418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml5xf\" (UniqueName: \"kubernetes.io/projected/9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c-kube-api-access-ml5xf\") pod \"kube-storage-version-migrator-operator-6769c5d45-t6zgc\" (UID: \"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.346580 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.346504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mr4f\" (UniqueName: \"kubernetes.io/projected/85b14099-ff6f-42c5-a9e8-38d1e3153f05-kube-api-access-2mr4f\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:54.414327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.414287 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-dmzg2\"" Apr 21 14:57:54.418268 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.418224 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" Apr 21 14:57:54.442795 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.442765 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p"] Apr 21 14:57:54.445465 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:57:54.445438 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba080c2_51f9_47d4_8e01_bccc93ada876.slice/crio-401244bf47e717c2c6aa2add5e18e2cbd8369fe552a1d810f1d5ffddc98fb047 WatchSource:0}: Error finding container 401244bf47e717c2c6aa2add5e18e2cbd8369fe552a1d810f1d5ffddc98fb047: Status 404 returned error can't find the container with id 401244bf47e717c2c6aa2add5e18e2cbd8369fe552a1d810f1d5ffddc98fb047 Apr 21 14:57:54.535015 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.534985 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc"] Apr 21 14:57:54.538455 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:57:54.538428 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8ea7ab_3b7d_4a56_ab37_533e373f1a5c.slice/crio-b74c9ccb38a0c079ebe5f0f3228d5a36fbf22dc2432a5267f59d2c6a39ab841a WatchSource:0}: Error finding container b74c9ccb38a0c079ebe5f0f3228d5a36fbf22dc2432a5267f59d2c6a39ab841a: Status 404 returned error can't find the container with id b74c9ccb38a0c079ebe5f0f3228d5a36fbf22dc2432a5267f59d2c6a39ab841a Apr 21 14:57:54.857433 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.857388 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" event={"ID":"fba080c2-51f9-47d4-8e01-bccc93ada876","Type":"ContainerStarted","Data":"401244bf47e717c2c6aa2add5e18e2cbd8369fe552a1d810f1d5ffddc98fb047"} Apr 21 14:57:54.858278 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.858253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" event={"ID":"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c","Type":"ContainerStarted","Data":"b74c9ccb38a0c079ebe5f0f3228d5a36fbf22dc2432a5267f59d2c6a39ab841a"} Apr 21 14:57:54.921533 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:54.921493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:54.921695 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:54.921609 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 14:57:54.921695 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:54.921676 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls podName:85b14099-ff6f-42c5-a9e8-38d1e3153f05 nodeName:}" failed. 
No retries permitted until 2026-04-21 14:57:55.921657697 +0000 UTC m=+122.056635289 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wvf6s" (UID: "85b14099-ff6f-42c5-a9e8-38d1e3153f05") : secret "samples-operator-tls" not found Apr 21 14:57:55.930516 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:55.930477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:55.930991 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:55.930634 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 14:57:55.930991 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:55.930714 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls podName:85b14099-ff6f-42c5-a9e8-38d1e3153f05 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:57.930697803 +0000 UTC m=+124.065675390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wvf6s" (UID: "85b14099-ff6f-42c5-a9e8-38d1e3153f05") : secret "samples-operator-tls" not found Apr 21 14:57:57.865526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:57.865403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" event={"ID":"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c","Type":"ContainerStarted","Data":"b042d0db1e8687e6c67a422c1d2db988805e2decc4a1a3d7a104dd5bc127de65"} Apr 21 14:57:57.868118 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:57.868080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" event={"ID":"fba080c2-51f9-47d4-8e01-bccc93ada876","Type":"ContainerStarted","Data":"79d434590d649ab904cdd8e86fb80622fde74536d3730d06e175fca05ba892cd"} Apr 21 14:57:57.882376 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:57.882336 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" podStartSLOduration=1.565812775 podStartE2EDuration="3.882323001s" podCreationTimestamp="2026-04-21 14:57:54 +0000 UTC" firstStartedPulling="2026-04-21 14:57:54.540208045 +0000 UTC m=+120.675185636" lastFinishedPulling="2026-04-21 14:57:56.856718276 +0000 UTC m=+122.991695862" observedRunningTime="2026-04-21 14:57:57.881999944 +0000 UTC m=+124.016977555" watchObservedRunningTime="2026-04-21 14:57:57.882323001 +0000 UTC m=+124.017300613" Apr 21 14:57:57.896793 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:57.896751 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" podStartSLOduration=2.484250741 podStartE2EDuration="4.896738441s" 
podCreationTimestamp="2026-04-21 14:57:53 +0000 UTC" firstStartedPulling="2026-04-21 14:57:54.447330802 +0000 UTC m=+120.582308388" lastFinishedPulling="2026-04-21 14:57:56.859818501 +0000 UTC m=+122.994796088" observedRunningTime="2026-04-21 14:57:57.896333956 +0000 UTC m=+124.031311568" watchObservedRunningTime="2026-04-21 14:57:57.896738441 +0000 UTC m=+124.031716050" Apr 21 14:57:57.950660 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:57:57.947165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:57:57.950660 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:57.947337 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 14:57:57.950660 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:57:57.947427 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls podName:85b14099-ff6f-42c5-a9e8-38d1e3153f05 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:01.947405674 +0000 UTC m=+128.082383275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wvf6s" (UID: "85b14099-ff6f-42c5-a9e8-38d1e3153f05") : secret "samples-operator-tls" not found Apr 21 14:58:00.311919 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.311879 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5mvmh"] Apr 21 14:58:00.315135 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.315115 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.317191 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.317170 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 14:58:00.317660 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.317644 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 14:58:00.317660 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.317653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-npmmn\"" Apr 21 14:58:00.317752 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.317737 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 14:58:00.317812 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.317737 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 14:58:00.320844 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.320827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5mvmh"] Apr 21 14:58:00.466600 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.466570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/553528fc-9a9f-434b-a257-73aee39129bf-signing-cabundle\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.466768 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.466633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/553528fc-9a9f-434b-a257-73aee39129bf-signing-key\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.466768 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.466699 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92pt\" (UniqueName: \"kubernetes.io/projected/553528fc-9a9f-434b-a257-73aee39129bf-kube-api-access-q92pt\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.568070 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.567981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/553528fc-9a9f-434b-a257-73aee39129bf-signing-cabundle\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.568070 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.568038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/553528fc-9a9f-434b-a257-73aee39129bf-signing-key\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.568327 ip-10-0-130-121 kubenswrapper[2576]: I0421 
14:58:00.568099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92pt\" (UniqueName: \"kubernetes.io/projected/553528fc-9a9f-434b-a257-73aee39129bf-kube-api-access-q92pt\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.568774 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.568752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/553528fc-9a9f-434b-a257-73aee39129bf-signing-cabundle\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.570583 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.570558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/553528fc-9a9f-434b-a257-73aee39129bf-signing-key\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.580055 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.580032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92pt\" (UniqueName: \"kubernetes.io/projected/553528fc-9a9f-434b-a257-73aee39129bf-kube-api-access-q92pt\") pod \"service-ca-865cb79987-5mvmh\" (UID: \"553528fc-9a9f-434b-a257-73aee39129bf\") " pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.623936 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.623894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5mvmh" Apr 21 14:58:00.672785 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.672612 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vnnr7"] Apr 21 14:58:00.677350 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.677321 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.679621 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.679600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 14:58:00.680024 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.679997 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 14:58:00.680457 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.680440 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 14:58:00.680457 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.680452 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 14:58:00.680620 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.680455 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s4jk5\"" Apr 21 14:58:00.690396 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.689975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vnnr7"] Apr 21 14:58:00.753554 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.753524 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5mvmh"] Apr 21 14:58:00.756664 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:00.756634 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod553528fc_9a9f_434b_a257_73aee39129bf.slice/crio-1b6bdfe4f78b334a4a59eb703dafc2e74b1d6a89ad2379a9ea8bfd202f562b42 WatchSource:0}: Error finding container 1b6bdfe4f78b334a4a59eb703dafc2e74b1d6a89ad2379a9ea8bfd202f562b42: Status 404 returned error can't find the container with id 1b6bdfe4f78b334a4a59eb703dafc2e74b1d6a89ad2379a9ea8bfd202f562b42 Apr 21 14:58:00.770688 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.770653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/08e42ac9-1105-4882-9063-9f6e3e6d42a6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.770787 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.770741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.770852 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.770791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/08e42ac9-1105-4882-9063-9f6e3e6d42a6-crio-socket\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.770852 ip-10-0-130-121 kubenswrapper[2576]: I0421 
14:58:00.770843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/08e42ac9-1105-4882-9063-9f6e3e6d42a6-data-volume\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.770973 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.770959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gmc6\" (UniqueName: \"kubernetes.io/projected/08e42ac9-1105-4882-9063-9f6e3e6d42a6-kube-api-access-4gmc6\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.872389 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.872295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.872389 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.872351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/08e42ac9-1105-4882-9063-9f6e3e6d42a6-crio-socket\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.872576 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.872424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/08e42ac9-1105-4882-9063-9f6e3e6d42a6-crio-socket\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.872576 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.872460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/08e42ac9-1105-4882-9063-9f6e3e6d42a6-data-volume\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.872576 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:00.872470 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:00.872576 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:00.872539 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls podName:08e42ac9-1105-4882-9063-9f6e3e6d42a6 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:01.372518722 +0000 UTC m=+127.507496315 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vnnr7" (UID: "08e42ac9-1105-4882-9063-9f6e3e6d42a6") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:00.872750 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.872620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gmc6\" (UniqueName: \"kubernetes.io/projected/08e42ac9-1105-4882-9063-9f6e3e6d42a6-kube-api-access-4gmc6\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.872750 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.872658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/08e42ac9-1105-4882-9063-9f6e3e6d42a6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.873334 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.873319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/08e42ac9-1105-4882-9063-9f6e3e6d42a6-data-volume\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.873509 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.873492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/08e42ac9-1105-4882-9063-9f6e3e6d42a6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.876223 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.876197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5mvmh" event={"ID":"553528fc-9a9f-434b-a257-73aee39129bf","Type":"ContainerStarted","Data":"884549b78d9592a5647f55da036169e71e9dfcc53375de5b0310b2a424ebe445"} Apr 21 14:58:00.876308 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.876233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5mvmh" event={"ID":"553528fc-9a9f-434b-a257-73aee39129bf","Type":"ContainerStarted","Data":"1b6bdfe4f78b334a4a59eb703dafc2e74b1d6a89ad2379a9ea8bfd202f562b42"} Apr 21 14:58:00.891526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.891506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gmc6\" (UniqueName: \"kubernetes.io/projected/08e42ac9-1105-4882-9063-9f6e3e6d42a6-kube-api-access-4gmc6\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:00.903786 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:00.903742 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-5mvmh" podStartSLOduration=0.903726913 podStartE2EDuration="903.726913ms" podCreationTimestamp="2026-04-21 14:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:58:00.902862078 +0000 UTC m=+127.037839688" watchObservedRunningTime="2026-04-21 14:58:00.903726913 +0000 UTC m=+127.038704523" Apr 21 14:58:01.377488 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:01.377435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:01.377895 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:01.377620 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:01.377895 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:01.377707 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls podName:08e42ac9-1105-4882-9063-9f6e3e6d42a6 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:02.377690338 +0000 UTC m=+128.512667924 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vnnr7" (UID: "08e42ac9-1105-4882-9063-9f6e3e6d42a6") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:01.987612 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:01.987565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:58:01.987801 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:01.987761 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 14:58:01.987880 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:01.987862 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls podName:85b14099-ff6f-42c5-a9e8-38d1e3153f05 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:09.987837629 +0000 UTC m=+136.122815216 (durationBeforeRetry 8s). 
Apr 21 14:58:02.391508 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:02.391472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7"
Apr 21 14:58:02.391962 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:02.391647 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 21 14:58:02.391962 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:02.391746 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls podName:08e42ac9-1105-4882-9063-9f6e3e6d42a6 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:04.391723329 +0000 UTC m=+130.526700933 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vnnr7" (UID: "08e42ac9-1105-4882-9063-9f6e3e6d42a6") : secret "insights-runtime-extractor-tls" not found
Apr 21 14:58:03.298366 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:03.298324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr"
Apr 21 14:58:03.298562 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:03.298438 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 14:58:03.298562 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:03.298490 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs podName:f531f65c-d73f-48df-b4b9-fffda9589a9e nodeName:}" failed. No retries permitted until 2026-04-21 15:00:05.298476314 +0000 UTC m=+251.433453901 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs") pod "network-metrics-daemon-ktgkr" (UID: "f531f65c-d73f-48df-b4b9-fffda9589a9e") : secret "metrics-daemon-secret" not found
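Editor's note on the nestedpendingoperations entries above: the same failing volume is retried with a durationBeforeRetry that roughly doubles on each consecutive failure — 1s and 2s above, 4s just below for insights-runtime-extractor-tls, 8s for samples-operator-tls — while metrics-certs, which has evidently been failing for much longer, sits at 2m2s, which looks like a cap. A minimal sketch of that doubling-with-cap pattern; it is illustrative only, not the kubelet's implementation, and the 500ms starting value is taken from the durationBeforeRetry 500ms logged later for prometheus-operator-tls:

```go
// Minimal sketch (not kubelet code) of the retry pattern visible in the
// nestedpendingoperations entries: each consecutive failure of the same volume
// operation roughly doubles durationBeforeRetry, up to a cap.
package main

import (
	"fmt"
	"time"
)

const (
	initialDelay = 500 * time.Millisecond        // matches the 500ms seen later for prometheus-operator-tls
	maxDelay     = 2*time.Minute + 2*time.Second // matches the 2m2s seen for metrics-certs
)

// nextDelay returns the delay to wait before the next retry, given the previous delay.
func nextDelay(prev time.Duration) time.Duration {
	if prev == 0 {
		return initialDelay
	}
	next := prev * 2
	if next > maxDelay {
		next = maxDelay
	}
	return next
}

func main() {
	// Prints 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s — the same progression the
	// log shows for volumes whose Secret has not been created yet.
	d := time.Duration(0)
	for i := 0; i < 10; i++ {
		d = nextDelay(d)
		fmt.Print(d, " ")
	}
	fmt.Println()
}
```

The underlying cause in each case is stated directly in the error text: the referenced Secret does not yet exist in the pod's namespace, so the retries keep failing until it is created — as eventually happens for insights-runtime-extractor-tls at 14:58:08 below.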
Apr 21 14:58:04.407031 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:04.406988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7"
Apr 21 14:58:04.407439 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:04.407129 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 21 14:58:04.407439 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:04.407188 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls podName:08e42ac9-1105-4882-9063-9f6e3e6d42a6 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:08.407173891 +0000 UTC m=+134.542151478 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vnnr7" (UID: "08e42ac9-1105-4882-9063-9f6e3e6d42a6") : secret "insights-runtime-extractor-tls" not found
Apr 21 14:58:08.439089 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:08.439043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7"
Apr 21 14:58:08.441632 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:08.441609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/08e42ac9-1105-4882-9063-9f6e3e6d42a6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vnnr7\" (UID: \"08e42ac9-1105-4882-9063-9f6e3e6d42a6\") " pod="openshift-insights/insights-runtime-extractor-vnnr7"
Apr 21 14:58:08.490438 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:08.490406 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vnnr7" Apr 21 14:58:08.608021 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:08.607997 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vnnr7"] Apr 21 14:58:08.610452 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:08.610423 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e42ac9_1105_4882_9063_9f6e3e6d42a6.slice/crio-5725597f90793fd307b99d2063507e19fb02ae57ab76b57a95f15479ff029304 WatchSource:0}: Error finding container 5725597f90793fd307b99d2063507e19fb02ae57ab76b57a95f15479ff029304: Status 404 returned error can't find the container with id 5725597f90793fd307b99d2063507e19fb02ae57ab76b57a95f15479ff029304 Apr 21 14:58:08.895662 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:08.895626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vnnr7" event={"ID":"08e42ac9-1105-4882-9063-9f6e3e6d42a6","Type":"ContainerStarted","Data":"89187d7fa8b2852610c72a0c9b62d430bd5778fc28c041381b889c9936223704"} Apr 21 14:58:08.895662 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:08.895662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vnnr7" event={"ID":"08e42ac9-1105-4882-9063-9f6e3e6d42a6","Type":"ContainerStarted","Data":"5725597f90793fd307b99d2063507e19fb02ae57ab76b57a95f15479ff029304"} Apr 21 14:58:09.900501 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:09.900461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vnnr7" event={"ID":"08e42ac9-1105-4882-9063-9f6e3e6d42a6","Type":"ContainerStarted","Data":"b5377182396a28e46f59a3ac02681b5b5826e85567370f0406e1957d9adc6cbf"} Apr 21 14:58:10.050845 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:10.050809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:58:10.053747 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:10.053719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b14099-ff6f-42c5-a9e8-38d1e3153f05-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wvf6s\" (UID: \"85b14099-ff6f-42c5-a9e8-38d1e3153f05\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:58:10.315023 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:10.314986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-d9278\"" Apr 21 14:58:10.323632 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:10.323600 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" Apr 21 14:58:10.721983 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:10.721957 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s"] Apr 21 14:58:10.904044 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:10.903960 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" event={"ID":"85b14099-ff6f-42c5-a9e8-38d1e3153f05","Type":"ContainerStarted","Data":"054d28164d479cbed087a9cbbe8d7edccf50e2ec00139b21d17b1aa9f248a47f"} Apr 21 14:58:10.905817 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:10.905793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vnnr7" event={"ID":"08e42ac9-1105-4882-9063-9f6e3e6d42a6","Type":"ContainerStarted","Data":"6c204be4745b61c4146b09690ef15a976ef40e3934023851beefe8fa6330ea77"} Apr 21 14:58:10.924108 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:10.924065 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vnnr7" podStartSLOduration=8.946399994 podStartE2EDuration="10.924050913s" podCreationTimestamp="2026-04-21 14:58:00 +0000 UTC" firstStartedPulling="2026-04-21 14:58:08.676506483 +0000 UTC m=+134.811484071" lastFinishedPulling="2026-04-21 14:58:10.654157399 +0000 UTC m=+136.789134990" observedRunningTime="2026-04-21 14:58:10.923229855 +0000 UTC m=+137.058207466" watchObservedRunningTime="2026-04-21 14:58:10.924050913 +0000 UTC m=+137.059028562" Apr 21 14:58:12.916866 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:12.916783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" event={"ID":"85b14099-ff6f-42c5-a9e8-38d1e3153f05","Type":"ContainerStarted","Data":"5ad930f8264c184d00387cb949d4052118b2e6e67bb25614095b36e5cf99c041"} Apr 21 14:58:12.916866 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:12.916820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" event={"ID":"85b14099-ff6f-42c5-a9e8-38d1e3153f05","Type":"ContainerStarted","Data":"4694351bb12c1f5b2c7fbb3898e248ca5ec5f640dc7630e3867e2bf1b6772ca5"} Apr 21 14:58:12.933414 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:12.933333 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wvf6s" podStartSLOduration=17.143029795 podStartE2EDuration="18.933318922s" podCreationTimestamp="2026-04-21 14:57:54 +0000 UTC" firstStartedPulling="2026-04-21 14:58:10.76190452 +0000 UTC m=+136.896882107" lastFinishedPulling="2026-04-21 14:58:12.552193647 +0000 UTC m=+138.687171234" observedRunningTime="2026-04-21 14:58:12.932456084 +0000 UTC m=+139.067433695" watchObservedRunningTime="2026-04-21 14:58:12.933318922 +0000 UTC m=+139.068296530" Apr 21 14:58:21.454931 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.454892 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds"] Apr 21 14:58:21.460756 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.460735 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds" Apr 21 14:58:21.462641 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.462615 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mfh47\"" Apr 21 14:58:21.470072 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.470049 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds"] Apr 21 14:58:21.516480 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.516441 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5df9bb698f-sqrff"] Apr 21 14:58:21.516666 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:21.516646 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" podUID="9033c602-82a6-4c53-b7df-ea48a7be061b" Apr 21 14:58:21.548437 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.548401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgpcs\" (UniqueName: \"kubernetes.io/projected/396d7187-2793-45b5-a63d-cc5e84f98ab0-kube-api-access-vgpcs\") pod \"network-check-source-8894fc9bd-w82ds\" (UID: \"396d7187-2793-45b5-a63d-cc5e84f98ab0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds" Apr 21 14:58:21.624216 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.624181 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq"] Apr 21 14:58:21.628134 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.628114 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" Apr 21 14:58:21.630553 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.630530 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-rmtpl\"" Apr 21 14:58:21.630658 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.630553 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 14:58:21.639551 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.639527 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq"] Apr 21 14:58:21.649396 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.649374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgpcs\" (UniqueName: \"kubernetes.io/projected/396d7187-2793-45b5-a63d-cc5e84f98ab0-kube-api-access-vgpcs\") pod \"network-check-source-8894fc9bd-w82ds\" (UID: \"396d7187-2793-45b5-a63d-cc5e84f98ab0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds" Apr 21 14:58:21.671693 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.671662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgpcs\" (UniqueName: \"kubernetes.io/projected/396d7187-2793-45b5-a63d-cc5e84f98ab0-kube-api-access-vgpcs\") pod \"network-check-source-8894fc9bd-w82ds\" (UID: \"396d7187-2793-45b5-a63d-cc5e84f98ab0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds" Apr 21 14:58:21.750255 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.750211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9361973-6868-4e2e-bf1e-b99353972328-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-47zrq\" (UID: \"c9361973-6868-4e2e-bf1e-b99353972328\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" Apr 21 14:58:21.770251 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.770214 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds" Apr 21 14:58:21.851218 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.851187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9361973-6868-4e2e-bf1e-b99353972328-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-47zrq\" (UID: \"c9361973-6868-4e2e-bf1e-b99353972328\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" Apr 21 14:58:21.853838 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.853809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9361973-6868-4e2e-bf1e-b99353972328-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-47zrq\" (UID: \"c9361973-6868-4e2e-bf1e-b99353972328\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" Apr 21 14:58:21.885867 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.885838 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds"] Apr 21 14:58:21.888916 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:21.888882 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod396d7187_2793_45b5_a63d_cc5e84f98ab0.slice/crio-05bbe88b710c7f59a74eab155b66970658fbfd1583709641bb6b8881dd2f18af WatchSource:0}: Error finding container 05bbe88b710c7f59a74eab155b66970658fbfd1583709641bb6b8881dd2f18af: Status 404 returned error can't find the container with id 05bbe88b710c7f59a74eab155b66970658fbfd1583709641bb6b8881dd2f18af Apr 21 14:58:21.937148 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.937122 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" Apr 21 14:58:21.941850 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.941829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:58:21.941850 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.941830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds" event={"ID":"396d7187-2793-45b5-a63d-cc5e84f98ab0","Type":"ContainerStarted","Data":"05bbe88b710c7f59a74eab155b66970658fbfd1583709641bb6b8881dd2f18af"} Apr 21 14:58:21.946023 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:21.946001 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:58:22.052685 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.052649 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9033c602-82a6-4c53-b7df-ea48a7be061b-ca-trust-extracted\") pod \"9033c602-82a6-4c53-b7df-ea48a7be061b\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " Apr 21 14:58:22.052859 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.052701 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-trusted-ca\") pod \"9033c602-82a6-4c53-b7df-ea48a7be061b\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " Apr 21 14:58:22.052859 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.052732 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-image-registry-private-configuration\") pod \"9033c602-82a6-4c53-b7df-ea48a7be061b\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " Apr 21 14:58:22.052859 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.052790 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-certificates\") pod \"9033c602-82a6-4c53-b7df-ea48a7be061b\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " Apr 21 14:58:22.052859 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.052840 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-installation-pull-secrets\") pod \"9033c602-82a6-4c53-b7df-ea48a7be061b\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " Apr 21 14:58:22.053082 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.052868 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhh4c\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-kube-api-access-jhh4c\") pod \"9033c602-82a6-4c53-b7df-ea48a7be061b\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " Apr 21 14:58:22.053082 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.052895 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-bound-sa-token\") pod \"9033c602-82a6-4c53-b7df-ea48a7be061b\" (UID: \"9033c602-82a6-4c53-b7df-ea48a7be061b\") " Apr 21 14:58:22.053082 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.052895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9033c602-82a6-4c53-b7df-ea48a7be061b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9033c602-82a6-4c53-b7df-ea48a7be061b" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:58:22.053228 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.053135 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9033c602-82a6-4c53-b7df-ea48a7be061b-ca-trust-extracted\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:58:22.053228 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.053139 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9033c602-82a6-4c53-b7df-ea48a7be061b" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:58:22.053228 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.053180 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9033c602-82a6-4c53-b7df-ea48a7be061b" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:58:22.055521 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.055485 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9033c602-82a6-4c53-b7df-ea48a7be061b" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:58:22.055646 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.055533 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-kube-api-access-jhh4c" (OuterVolumeSpecName: "kube-api-access-jhh4c") pod "9033c602-82a6-4c53-b7df-ea48a7be061b" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b"). InnerVolumeSpecName "kube-api-access-jhh4c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:58:22.055646 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.055577 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9033c602-82a6-4c53-b7df-ea48a7be061b" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:58:22.055728 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.055703 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "9033c602-82a6-4c53-b7df-ea48a7be061b" (UID: "9033c602-82a6-4c53-b7df-ea48a7be061b"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:58:22.070050 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.070025 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq"] Apr 21 14:58:22.073482 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:22.073454 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9361973_6868_4e2e_bf1e_b99353972328.slice/crio-0004f727eb6cfe65c808f6778288ca984759f88f968a8bc867fe64409b5b01f5 WatchSource:0}: Error finding container 0004f727eb6cfe65c808f6778288ca984759f88f968a8bc867fe64409b5b01f5: Status 404 returned error can't find the container with id 0004f727eb6cfe65c808f6778288ca984759f88f968a8bc867fe64409b5b01f5 Apr 21 14:58:22.154569 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.154531 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-trusted-ca\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:58:22.154569 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.154560 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-image-registry-private-configuration\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:58:22.154569 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.154571 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-certificates\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:58:22.154821 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.154583 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9033c602-82a6-4c53-b7df-ea48a7be061b-installation-pull-secrets\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:58:22.154821 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.154592 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhh4c\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-kube-api-access-jhh4c\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:58:22.154821 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.154601 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-bound-sa-token\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:58:22.945941 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.945892 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" event={"ID":"c9361973-6868-4e2e-bf1e-b99353972328","Type":"ContainerStarted","Data":"0004f727eb6cfe65c808f6778288ca984759f88f968a8bc867fe64409b5b01f5"} Apr 21 14:58:22.947375 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.947339 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds" event={"ID":"396d7187-2793-45b5-a63d-cc5e84f98ab0","Type":"ContainerStarted","Data":"4b0434cea042f37b5baf829fcaf7beececef35f083b24f5b6b0d0c7bea0aa012"} Apr 21 14:58:22.947504 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:58:22.947387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5df9bb698f-sqrff" Apr 21 14:58:22.965636 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.965589 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-w82ds" podStartSLOduration=1.965575486 podStartE2EDuration="1.965575486s" podCreationTimestamp="2026-04-21 14:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:58:22.964848822 +0000 UTC m=+149.099826444" watchObservedRunningTime="2026-04-21 14:58:22.965575486 +0000 UTC m=+149.100553094" Apr 21 14:58:22.992304 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.992276 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5df9bb698f-sqrff"] Apr 21 14:58:22.996336 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:22.996313 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5df9bb698f-sqrff"] Apr 21 14:58:23.062507 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:23.062472 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9033c602-82a6-4c53-b7df-ea48a7be061b-registry-tls\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:58:23.951398 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:23.951364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" event={"ID":"c9361973-6868-4e2e-bf1e-b99353972328","Type":"ContainerStarted","Data":"c54cdac171b81a79bd834f3913334d9d872d9a5d7d77fa98f307c0fbffa72471"} Apr 21 14:58:23.951763 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:23.951637 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" Apr 21 14:58:23.956220 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:23.956197 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" Apr 21 14:58:24.026780 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.026736 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-47zrq" podStartSLOduration=2.077382665 podStartE2EDuration="3.026724365s" podCreationTimestamp="2026-04-21 14:58:21 +0000 UTC" firstStartedPulling="2026-04-21 14:58:22.075441268 +0000 UTC m=+148.210418856" lastFinishedPulling="2026-04-21 14:58:23.024782955 +0000 UTC m=+149.159760556" observedRunningTime="2026-04-21 14:58:24.001197949 +0000 UTC m=+150.136175577" watchObservedRunningTime="2026-04-21 14:58:24.026724365 +0000 UTC m=+150.161701974" Apr 21 14:58:24.450207 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.450174 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9033c602-82a6-4c53-b7df-ea48a7be061b" path="/var/lib/kubelet/pods/9033c602-82a6-4c53-b7df-ea48a7be061b/volumes" Apr 21 14:58:24.816840 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.816805 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-wncmm"] Apr 21 14:58:24.820161 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:58:24.820146 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.822326 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.822297 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 14:58:24.822462 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.822350 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 14:58:24.822462 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.822358 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 14:58:24.822979 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.822951 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 14:58:24.823090 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.823021 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gfk6h\"" Apr 21 14:58:24.823146 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.823085 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 14:58:24.827957 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.827938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-wncmm"] Apr 21 14:58:24.874971 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.874951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.875087 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.874985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b213c96-1aa7-4093-bb3d-e20a524c5c46-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.875087 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.875038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.875216 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.875100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l5hw\" (UniqueName: \"kubernetes.io/projected/6b213c96-1aa7-4093-bb3d-e20a524c5c46-kube-api-access-2l5hw\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.975855 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.975817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.976317 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.975889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2l5hw\" (UniqueName: \"kubernetes.io/projected/6b213c96-1aa7-4093-bb3d-e20a524c5c46-kube-api-access-2l5hw\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.976317 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.975933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.976317 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.975959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b213c96-1aa7-4093-bb3d-e20a524c5c46-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.976317 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:24.976044 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 21 14:58:24.976317 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:24.976130 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-tls podName:6b213c96-1aa7-4093-bb3d-e20a524c5c46 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:25.476107127 +0000 UTC m=+151.611084719 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-wncmm" (UID: "6b213c96-1aa7-4093-bb3d-e20a524c5c46") : secret "prometheus-operator-tls" not found Apr 21 14:58:24.976717 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.976698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b213c96-1aa7-4093-bb3d-e20a524c5c46-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.978567 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.978542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:24.984733 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:24.984699 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l5hw\" (UniqueName: \"kubernetes.io/projected/6b213c96-1aa7-4093-bb3d-e20a524c5c46-kube-api-access-2l5hw\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:25.480053 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:25.480019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:25.482806 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:25.482775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b213c96-1aa7-4093-bb3d-e20a524c5c46-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-wncmm\" (UID: \"6b213c96-1aa7-4093-bb3d-e20a524c5c46\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:25.729569 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:25.729538 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" Apr 21 14:58:25.847310 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:25.847279 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-wncmm"] Apr 21 14:58:25.850179 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:25.850150 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b213c96_1aa7_4093_bb3d_e20a524c5c46.slice/crio-2e01d466049965fa024e4865836f4486a0fa6424f22942abe60cfa61bbef2010 WatchSource:0}: Error finding container 2e01d466049965fa024e4865836f4486a0fa6424f22942abe60cfa61bbef2010: Status 404 returned error can't find the container with id 2e01d466049965fa024e4865836f4486a0fa6424f22942abe60cfa61bbef2010 Apr 21 14:58:25.959669 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:25.959634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" event={"ID":"6b213c96-1aa7-4093-bb3d-e20a524c5c46","Type":"ContainerStarted","Data":"2e01d466049965fa024e4865836f4486a0fa6424f22942abe60cfa61bbef2010"} Apr 21 14:58:27.966420 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:27.966384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" event={"ID":"6b213c96-1aa7-4093-bb3d-e20a524c5c46","Type":"ContainerStarted","Data":"4e699e16b9500049ce6237ee65b497aa8d02fb9f56095bcde5edf811da4d7cd1"} Apr 21 14:58:27.966420 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:27.966421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" event={"ID":"6b213c96-1aa7-4093-bb3d-e20a524c5c46","Type":"ContainerStarted","Data":"4e5d6e674158269973b765d436a7df655053c50403e2feb516c81415a4ecd92a"} Apr 21 14:58:27.982621 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:27.982573 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-wncmm" podStartSLOduration=2.54777479 podStartE2EDuration="3.982561428s" podCreationTimestamp="2026-04-21 14:58:24 +0000 UTC" firstStartedPulling="2026-04-21 14:58:25.852051782 +0000 UTC m=+151.987029370" lastFinishedPulling="2026-04-21 14:58:27.286838421 +0000 UTC m=+153.421816008" observedRunningTime="2026-04-21 14:58:27.982037648 +0000 UTC m=+154.117015259" watchObservedRunningTime="2026-04-21 14:58:27.982561428 +0000 UTC m=+154.117539038" Apr 21 14:58:30.187813 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.187774 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8vc5b"] Apr 21 14:58:30.191159 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.191138 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.193905 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.193876 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 14:58:30.194030 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.193944 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 14:58:30.194099 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.194066 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zpq9w"] Apr 21 14:58:30.194516 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.194489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4vvhw\"" Apr 21 14:58:30.194635 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.194542 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 14:58:30.197430 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.197411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.199725 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.199702 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 14:58:30.200230 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.199704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-v8bhv\"" Apr 21 14:58:30.200230 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.199742 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 14:58:30.200230 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.199804 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 14:58:30.201164 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.201143 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8vc5b"] Apr 21 14:58:30.306164 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:30.306116 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vbmd9" podUID="1ada8e2a-356e-4899-913a-b055b92852e4" Apr 21 14:58:30.319307 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-accelerators-collector-config\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319450 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-textfile\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319450 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-tls\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319561 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dhqh\" (UniqueName: \"kubernetes.io/projected/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-api-access-5dhqh\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.319561 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7e187a6-d29a-449a-b0b0-7531acc7f526-metrics-client-ca\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319561 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.319561 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.319764 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-wtmp\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319764 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/def25c30-d426-4e73-b651-27e69d1ef2aa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.319764 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319666 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.319764 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2t5q\" (UniqueName: \"kubernetes.io/projected/b7e187a6-d29a-449a-b0b0-7531acc7f526-kube-api-access-l2t5q\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319764 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-sys\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319920 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-root\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319920 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.319920 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.319856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/def25c30-d426-4e73-b651-27e69d1ef2aa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.322405 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:30.322381 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-s7jz7" podUID="2ee3aa35-2266-4471-8170-7e506d7cd358" Apr 21 14:58:30.420221 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-textfile\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.420385 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-tls\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.420457 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhqh\" (UniqueName: \"kubernetes.io/projected/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-api-access-5dhqh\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.420457 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7e187a6-d29a-449a-b0b0-7531acc7f526-metrics-client-ca\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.420570 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.420570 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.420672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-textfile\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-wtmp\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/def25c30-d426-4e73-b651-27e69d1ef2aa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2t5q\" (UniqueName: \"kubernetes.io/projected/b7e187a6-d29a-449a-b0b0-7531acc7f526-kube-api-access-l2t5q\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-sys\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.420978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-root\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/def25c30-d426-4e73-b651-27e69d1ef2aa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7e187a6-d29a-449a-b0b0-7531acc7f526-metrics-client-ca\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-accelerators-collector-config\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-wtmp\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-root\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7e187a6-d29a-449a-b0b0-7531acc7f526-sys\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.421983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-accelerators-collector-config\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.422484 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.422103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/def25c30-d426-4e73-b651-27e69d1ef2aa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.423234 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.422400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/def25c30-d426-4e73-b651-27e69d1ef2aa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.423824 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.423791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-tls\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.424187 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.424162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.424510 ip-10-0-130-121 kubenswrapper[2576]: 
I0421 14:58:30.424491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.424894 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.424875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7e187a6-d29a-449a-b0b0-7531acc7f526-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.430104 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.430074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhqh\" (UniqueName: \"kubernetes.io/projected/def25c30-d426-4e73-b651-27e69d1ef2aa-kube-api-access-5dhqh\") pod \"kube-state-metrics-69db897b98-8vc5b\" (UID: \"def25c30-d426-4e73-b651-27e69d1ef2aa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.430333 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.430314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2t5q\" (UniqueName: \"kubernetes.io/projected/b7e187a6-d29a-449a-b0b0-7531acc7f526-kube-api-access-l2t5q\") pod \"node-exporter-zpq9w\" (UID: \"b7e187a6-d29a-449a-b0b0-7531acc7f526\") " pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.464931 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:30.464875 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ktgkr" podUID="f531f65c-d73f-48df-b4b9-fffda9589a9e" Apr 21 14:58:30.501776 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.501752 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" Apr 21 14:58:30.508558 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.508528 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zpq9w" Apr 21 14:58:30.518853 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:30.518825 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e187a6_d29a_449a_b0b0_7531acc7f526.slice/crio-c8019c0eea7edb21edb7ae0532f2c5267b5bb41d786f17dff4f5693b273d678a WatchSource:0}: Error finding container c8019c0eea7edb21edb7ae0532f2c5267b5bb41d786f17dff4f5693b273d678a: Status 404 returned error can't find the container with id c8019c0eea7edb21edb7ae0532f2c5267b5bb41d786f17dff4f5693b273d678a Apr 21 14:58:30.628686 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.628658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8vc5b"] Apr 21 14:58:30.631361 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:30.631320 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef25c30_d426_4e73_b651_27e69d1ef2aa.slice/crio-ebff58b419bc43cdc57f2b3f76e5e71b3d655114cc6624c78ff0b1ddca1525f1 WatchSource:0}: Error finding container ebff58b419bc43cdc57f2b3f76e5e71b3d655114cc6624c78ff0b1ddca1525f1: Status 404 returned error can't find the container with id ebff58b419bc43cdc57f2b3f76e5e71b3d655114cc6624c78ff0b1ddca1525f1 Apr 21 14:58:30.974500 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.974462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" event={"ID":"def25c30-d426-4e73-b651-27e69d1ef2aa","Type":"ContainerStarted","Data":"ebff58b419bc43cdc57f2b3f76e5e71b3d655114cc6624c78ff0b1ddca1525f1"} Apr 21 14:58:30.975769 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.975738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpq9w" event={"ID":"b7e187a6-d29a-449a-b0b0-7531acc7f526","Type":"ContainerStarted","Data":"c8019c0eea7edb21edb7ae0532f2c5267b5bb41d786f17dff4f5693b273d678a"} Apr 21 14:58:30.975921 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:30.975771 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vbmd9" Apr 21 14:58:31.287999 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.287684 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:58:31.292758 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.292727 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.295039 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.295015 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 14:58:31.295579 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.295557 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 14:58:31.295644 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.295605 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-69b4n\"" Apr 21 14:58:31.295687 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.295641 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 14:58:31.295739 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.295567 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 14:58:31.295930 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.295914 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 14:58:31.296022 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.295928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 14:58:31.296263 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.296219 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 14:58:31.296332 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.296262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 14:58:31.296332 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.296275 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 14:58:31.309778 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.309737 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:58:31.431449 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-config-out\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.431449 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.431743 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.431743 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l79zg\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-kube-api-access-l79zg\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.431743 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-web-config\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.431743 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.431743 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.431743 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.432040 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.432040 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.432040 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-config-volume\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.432040 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.432040 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.431909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533046 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.532963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-config-out\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533046 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533046 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533338 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l79zg\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-kube-api-access-l79zg\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533338 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-web-config\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533338 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533338 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533338 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:58:31.533303 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle podName:7d85db67-bde4-498a-9a86-f806d6da0188 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:32.033279727 +0000 UTC m=+158.168257317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188") : configmap references non-existent config key: ca-bundle.crt Apr 21 14:58:31.533586 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533586 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533586 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533586 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-config-volume\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533586 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.533831 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.533564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.536156 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.535710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-config-out\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.537658 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.537050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.537658 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.537293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.537658 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.537365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.538145 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.537973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.538267 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.538146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.538535 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.538459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.538913 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.538889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.540997 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:58:31.540804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-config-volume\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.541124 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.541096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-web-config\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.541835 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.541568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.545483 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.545458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l79zg\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-kube-api-access-l79zg\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:31.980412 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.980384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" event={"ID":"def25c30-d426-4e73-b651-27e69d1ef2aa","Type":"ContainerStarted","Data":"499c7f32c3025ded9028d54a23263a3a558c2275ad4a75183ddbc21a5ac82c1c"} Apr 21 14:58:31.982315 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.982287 2576 generic.go:358] "Generic (PLEG): container finished" podID="b7e187a6-d29a-449a-b0b0-7531acc7f526" containerID="19a903db143eb7ee0a4be661aa457a90df96864564de09252cb78f6c59f083eb" exitCode=0 Apr 21 14:58:31.982443 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:31.982330 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpq9w" event={"ID":"b7e187a6-d29a-449a-b0b0-7531acc7f526","Type":"ContainerDied","Data":"19a903db143eb7ee0a4be661aa457a90df96864564de09252cb78f6c59f083eb"} Apr 21 14:58:32.037727 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.037566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:32.039325 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.039296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:32.155700 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.155533 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/thanos-querier-76449cc8c-qs844"] Apr 21 14:58:32.159859 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.159835 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.161902 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.161885 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 14:58:32.162098 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.161992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pjsrj\"" Apr 21 14:58:32.162157 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.162127 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 14:58:32.162218 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.161998 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7br69rtuk3t9u\"" Apr 21 14:58:32.162302 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.162201 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 14:58:32.162417 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.162406 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 14:58:32.162576 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.162561 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 14:58:32.169650 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.169619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-76449cc8c-qs844"] Apr 21 14:58:32.211397 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.211362 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:58:32.239594 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.239519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.239594 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.239565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.239841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.239601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-tls\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.239841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.239635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.239841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.239659 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-metrics-client-ca\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.239841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.239696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc88j\" (UniqueName: \"kubernetes.io/projected/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-kube-api-access-pc88j\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.239841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.239725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-grpc-tls\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.239841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.239808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.340767 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.340736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc88j\" (UniqueName: \"kubernetes.io/projected/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-kube-api-access-pc88j\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.341188 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.340777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-grpc-tls\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.341188 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.340852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.341188 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.341041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.341188 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.341073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.341188 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.341118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-tls\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.341188 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.341160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.341188 
ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.341187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-metrics-client-ca\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.342225 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.342174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-metrics-client-ca\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.343805 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.343769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.344045 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.344012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.344136 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.344105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.344194 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.344147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-grpc-tls\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.344422 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.344394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.344502 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.344487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-secret-thanos-querier-tls\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 
14:58:32.349061 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.347546 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:58:32.350013 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:32.349986 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d85db67_bde4_498a_9a86_f806d6da0188.slice/crio-fdd6240355ac07d57530ea54113137e7cc3813e3f5e55c99f253e9bb624d1b37 WatchSource:0}: Error finding container fdd6240355ac07d57530ea54113137e7cc3813e3f5e55c99f253e9bb624d1b37: Status 404 returned error can't find the container with id fdd6240355ac07d57530ea54113137e7cc3813e3f5e55c99f253e9bb624d1b37 Apr 21 14:58:32.357151 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.357127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc88j\" (UniqueName: \"kubernetes.io/projected/6c13ea85-b27b-4d7b-95f7-e1b478b57c96-kube-api-access-pc88j\") pod \"thanos-querier-76449cc8c-qs844\" (UID: \"6c13ea85-b27b-4d7b-95f7-e1b478b57c96\") " pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.469530 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.469453 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:32.590674 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.590534 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-76449cc8c-qs844"] Apr 21 14:58:32.593151 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:32.593118 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c13ea85_b27b_4d7b_95f7_e1b478b57c96.slice/crio-5973d451d2f8500d832eeacc97e2c2a94256138786f05e609245463806d4ed14 WatchSource:0}: Error finding container 5973d451d2f8500d832eeacc97e2c2a94256138786f05e609245463806d4ed14: Status 404 returned error can't find the container with id 5973d451d2f8500d832eeacc97e2c2a94256138786f05e609245463806d4ed14 Apr 21 14:58:32.987323 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.987286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpq9w" event={"ID":"b7e187a6-d29a-449a-b0b0-7531acc7f526","Type":"ContainerStarted","Data":"9f977a4273c78c71352e2359a82c6661d5108e9d2ec74a2888e9d7cf314408fa"} Apr 21 14:58:32.987323 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.987327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpq9w" event={"ID":"b7e187a6-d29a-449a-b0b0-7531acc7f526","Type":"ContainerStarted","Data":"16c8536f10daabcfae0b651cd322f32e287c3405ac72d2f93823c518c59bd161"} Apr 21 14:58:32.988754 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.988721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerStarted","Data":"fdd6240355ac07d57530ea54113137e7cc3813e3f5e55c99f253e9bb624d1b37"} Apr 21 14:58:32.990734 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.990706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" event={"ID":"def25c30-d426-4e73-b651-27e69d1ef2aa","Type":"ContainerStarted","Data":"81da32a41914a3d227999f22e41e4bd3f7404a784994ca69fb963fd6052b86a2"} Apr 21 14:58:32.990825 ip-10-0-130-121 kubenswrapper[2576]: 
I0421 14:58:32.990739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" event={"ID":"def25c30-d426-4e73-b651-27e69d1ef2aa","Type":"ContainerStarted","Data":"43fd9c5c7c8004148096a9815a15c8f5d21778ce861b916fe7c76229f5c9cb7a"} Apr 21 14:58:32.992131 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:32.992096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" event={"ID":"6c13ea85-b27b-4d7b-95f7-e1b478b57c96","Type":"ContainerStarted","Data":"5973d451d2f8500d832eeacc97e2c2a94256138786f05e609245463806d4ed14"} Apr 21 14:58:33.006190 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:33.006143 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zpq9w" podStartSLOduration=2.264673594 podStartE2EDuration="3.00612703s" podCreationTimestamp="2026-04-21 14:58:30 +0000 UTC" firstStartedPulling="2026-04-21 14:58:30.520490162 +0000 UTC m=+156.655467748" lastFinishedPulling="2026-04-21 14:58:31.261943596 +0000 UTC m=+157.396921184" observedRunningTime="2026-04-21 14:58:33.005667462 +0000 UTC m=+159.140645071" watchObservedRunningTime="2026-04-21 14:58:33.00612703 +0000 UTC m=+159.141104639" Apr 21 14:58:33.031963 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:33.031853 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-8vc5b" podStartSLOduration=1.763809893 podStartE2EDuration="3.03183383s" podCreationTimestamp="2026-04-21 14:58:30 +0000 UTC" firstStartedPulling="2026-04-21 14:58:30.633231594 +0000 UTC m=+156.768209182" lastFinishedPulling="2026-04-21 14:58:31.901255527 +0000 UTC m=+158.036233119" observedRunningTime="2026-04-21 14:58:33.03072115 +0000 UTC m=+159.165698759" watchObservedRunningTime="2026-04-21 14:58:33.03183383 +0000 UTC m=+159.166811444" Apr 21 14:58:33.995779 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:33.995744 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d85db67-bde4-498a-9a86-f806d6da0188" containerID="97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a" exitCode=0 Apr 21 14:58:33.996216 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:33.995839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerDied","Data":"97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a"} Apr 21 14:58:35.001271 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.001206 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" event={"ID":"6c13ea85-b27b-4d7b-95f7-e1b478b57c96","Type":"ContainerStarted","Data":"6124b64acce08cf1dca082bfb2c8b54887cfd8b46b4a1d274e565cadecf86f3e"} Apr 21 14:58:35.001691 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.001286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" event={"ID":"6c13ea85-b27b-4d7b-95f7-e1b478b57c96","Type":"ContainerStarted","Data":"df9cd0d34a7a4fa9f1287e5aec7fd6acd3ca239844e6565efe145d7a923e3d05"} Apr 21 14:58:35.001691 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.001300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" 
event={"ID":"6c13ea85-b27b-4d7b-95f7-e1b478b57c96","Type":"ContainerStarted","Data":"e042b6e8aa858d12c7bf880a250d29a4146f4aa84a84e7f6c593b62c94ca9ba9"} Apr 21 14:58:35.168354 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.168305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:58:35.173688 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.171862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ada8e2a-356e-4899-913a-b055b92852e4-metrics-tls\") pod \"dns-default-vbmd9\" (UID: \"1ada8e2a-356e-4899-913a-b055b92852e4\") " pod="openshift-dns/dns-default-vbmd9" Apr 21 14:58:35.178486 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.178456 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5sqkw\"" Apr 21 14:58:35.187422 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.187381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vbmd9" Apr 21 14:58:35.269489 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.269420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:58:35.272437 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.272381 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ee3aa35-2266-4471-8170-7e506d7cd358-cert\") pod \"ingress-canary-s7jz7\" (UID: \"2ee3aa35-2266-4471-8170-7e506d7cd358\") " pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:58:35.328494 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:35.328462 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vbmd9"] Apr 21 14:58:35.331494 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:35.331456 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ada8e2a_356e_4899_913a_b055b92852e4.slice/crio-8cbd32b0d1a051c4cec746d08ebbb4bbeed9eccfccba61b570a5a69dbfe65757 WatchSource:0}: Error finding container 8cbd32b0d1a051c4cec746d08ebbb4bbeed9eccfccba61b570a5a69dbfe65757: Status 404 returned error can't find the container with id 8cbd32b0d1a051c4cec746d08ebbb4bbeed9eccfccba61b570a5a69dbfe65757 Apr 21 14:58:36.008299 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:36.008225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vbmd9" event={"ID":"1ada8e2a-356e-4899-913a-b055b92852e4","Type":"ContainerStarted","Data":"8cbd32b0d1a051c4cec746d08ebbb4bbeed9eccfccba61b570a5a69dbfe65757"} Apr 21 14:58:36.011504 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:36.011476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerStarted","Data":"8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795"} Apr 21 14:58:36.011609 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:36.011515 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerStarted","Data":"8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123"} Apr 21 14:58:36.011609 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:36.011530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerStarted","Data":"b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c"} Apr 21 14:58:37.018259 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:37.018197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" event={"ID":"6c13ea85-b27b-4d7b-95f7-e1b478b57c96","Type":"ContainerStarted","Data":"02941a95b55f34cc38c974038e5772d344444f53fb785cc30eb97d3d27e62b92"} Apr 21 14:58:37.018704 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:37.018271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" event={"ID":"6c13ea85-b27b-4d7b-95f7-e1b478b57c96","Type":"ContainerStarted","Data":"36403fbcfa4b8425d1635ec39708477ec92ca9af7c8add7c741d5cf7223fec8f"} Apr 21 14:58:37.018704 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:37.018289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" event={"ID":"6c13ea85-b27b-4d7b-95f7-e1b478b57c96","Type":"ContainerStarted","Data":"d25b303c6a76b79b3eaaa1cdc941bd8d0e74a582387587c61dd294fd2e425866"} Apr 21 14:58:37.018704 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:37.018400 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:37.022023 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:37.022008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerStarted","Data":"8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae"} Apr 21 14:58:37.022083 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:37.022030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerStarted","Data":"c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a"} Apr 21 14:58:37.022083 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:37.022040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerStarted","Data":"ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b"} Apr 21 14:58:37.042507 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:37.042465 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" podStartSLOduration=1.368879832 podStartE2EDuration="5.042452255s" podCreationTimestamp="2026-04-21 14:58:32 +0000 UTC" firstStartedPulling="2026-04-21 14:58:32.595126612 +0000 UTC m=+158.730104198" lastFinishedPulling="2026-04-21 14:58:36.268699031 +0000 UTC m=+162.403676621" observedRunningTime="2026-04-21 14:58:37.040526768 +0000 UTC m=+163.175504378" watchObservedRunningTime="2026-04-21 14:58:37.042452255 +0000 UTC m=+163.177429864" Apr 21 14:58:37.069552 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:58:37.069511 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.665323794 podStartE2EDuration="6.069496889s" podCreationTimestamp="2026-04-21 14:58:31 +0000 UTC" firstStartedPulling="2026-04-21 14:58:32.352374349 +0000 UTC m=+158.487351936" lastFinishedPulling="2026-04-21 14:58:35.756547443 +0000 UTC m=+161.891525031" observedRunningTime="2026-04-21 14:58:37.068440011 +0000 UTC m=+163.203417623" watchObservedRunningTime="2026-04-21 14:58:37.069496889 +0000 UTC m=+163.204474498" Apr 21 14:58:38.027541 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:38.027496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vbmd9" event={"ID":"1ada8e2a-356e-4899-913a-b055b92852e4","Type":"ContainerStarted","Data":"730b71070015291f62c5ea2151c5118ce940180539f3558979fd6ff23731a5a0"} Apr 21 14:58:38.027541 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:38.027543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vbmd9" event={"ID":"1ada8e2a-356e-4899-913a-b055b92852e4","Type":"ContainerStarted","Data":"1357b23fa62fd89449f83bfec5ed9b933e91590f68012cf5b7f5743cf87ea9e3"} Apr 21 14:58:38.028049 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:38.027595 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vbmd9" Apr 21 14:58:38.043982 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:38.043930 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vbmd9" podStartSLOduration=129.445724787 podStartE2EDuration="2m11.043917194s" podCreationTimestamp="2026-04-21 14:56:27 +0000 UTC" firstStartedPulling="2026-04-21 14:58:35.333740884 +0000 UTC m=+161.468718475" lastFinishedPulling="2026-04-21 14:58:36.931933281 +0000 UTC m=+163.066910882" observedRunningTime="2026-04-21 14:58:38.042882509 +0000 UTC m=+164.177860130" watchObservedRunningTime="2026-04-21 14:58:38.043917194 +0000 UTC m=+164.178894803" Apr 21 14:58:38.999718 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:38.999687 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-x9ttl"] Apr 21 14:58:39.003029 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.003009 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-x9ttl" Apr 21 14:58:39.005790 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.005773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 14:58:39.006304 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.006287 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-82k2n\"" Apr 21 14:58:39.006399 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.006384 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 14:58:39.014760 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.014735 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-x9ttl"] Apr 21 14:58:39.108395 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.108363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx2wl\" (UniqueName: \"kubernetes.io/projected/5510583f-77e8-4c46-ab4c-93c316c34fff-kube-api-access-cx2wl\") pod \"downloads-6bcc868b7-x9ttl\" (UID: \"5510583f-77e8-4c46-ab4c-93c316c34fff\") " pod="openshift-console/downloads-6bcc868b7-x9ttl" Apr 21 14:58:39.209863 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.209830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx2wl\" (UniqueName: \"kubernetes.io/projected/5510583f-77e8-4c46-ab4c-93c316c34fff-kube-api-access-cx2wl\") pod \"downloads-6bcc868b7-x9ttl\" (UID: \"5510583f-77e8-4c46-ab4c-93c316c34fff\") " pod="openshift-console/downloads-6bcc868b7-x9ttl" Apr 21 14:58:39.217937 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.217916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx2wl\" (UniqueName: \"kubernetes.io/projected/5510583f-77e8-4c46-ab4c-93c316c34fff-kube-api-access-cx2wl\") pod \"downloads-6bcc868b7-x9ttl\" (UID: \"5510583f-77e8-4c46-ab4c-93c316c34fff\") " pod="openshift-console/downloads-6bcc868b7-x9ttl" Apr 21 14:58:39.312778 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.312702 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-x9ttl" Apr 21 14:58:39.430543 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:39.430520 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-x9ttl"] Apr 21 14:58:39.432977 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:39.432952 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5510583f_77e8_4c46_ab4c_93c316c34fff.slice/crio-d7947ee6acb6bb7a46b762adbdfd84894f4318f76280e84ec8031bab7ae2b24b WatchSource:0}: Error finding container d7947ee6acb6bb7a46b762adbdfd84894f4318f76280e84ec8031bab7ae2b24b: Status 404 returned error can't find the container with id d7947ee6acb6bb7a46b762adbdfd84894f4318f76280e84ec8031bab7ae2b24b Apr 21 14:58:40.036360 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:40.036316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-x9ttl" event={"ID":"5510583f-77e8-4c46-ab4c-93c316c34fff","Type":"ContainerStarted","Data":"d7947ee6acb6bb7a46b762adbdfd84894f4318f76280e84ec8031bab7ae2b24b"} Apr 21 14:58:41.445985 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:41.445940 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:58:41.446492 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:41.446048 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 14:58:41.448526 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:41.448500 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7xfsn\"" Apr 21 14:58:41.457198 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:41.457152 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s7jz7" Apr 21 14:58:41.618520 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:41.618485 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s7jz7"] Apr 21 14:58:41.622736 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:41.622694 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee3aa35_2266_4471_8170_7e506d7cd358.slice/crio-16e493eff01fe213ab0a9ca291fcdb02ad986f642539f275804ca3c0e0ce27f0 WatchSource:0}: Error finding container 16e493eff01fe213ab0a9ca291fcdb02ad986f642539f275804ca3c0e0ce27f0: Status 404 returned error can't find the container with id 16e493eff01fe213ab0a9ca291fcdb02ad986f642539f275804ca3c0e0ce27f0 Apr 21 14:58:42.044004 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:42.043966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s7jz7" event={"ID":"2ee3aa35-2266-4471-8170-7e506d7cd358","Type":"ContainerStarted","Data":"16e493eff01fe213ab0a9ca291fcdb02ad986f642539f275804ca3c0e0ce27f0"} Apr 21 14:58:43.035508 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:43.035429 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-76449cc8c-qs844" Apr 21 14:58:44.052157 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:44.052115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s7jz7" event={"ID":"2ee3aa35-2266-4471-8170-7e506d7cd358","Type":"ContainerStarted","Data":"4ae572e043ef38b89d603264bfb3ee1541e410ef4de7228df6bbff0b41e453cd"} Apr 21 14:58:44.071180 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:44.071124 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-s7jz7" podStartSLOduration=135.440820658 podStartE2EDuration="2m17.071108047s" podCreationTimestamp="2026-04-21 14:56:27 +0000 UTC" firstStartedPulling="2026-04-21 14:58:41.624929998 +0000 UTC m=+167.759907587" lastFinishedPulling="2026-04-21 14:58:43.255217388 +0000 UTC m=+169.390194976" observedRunningTime="2026-04-21 14:58:44.068791573 +0000 UTC m=+170.203769181" watchObservedRunningTime="2026-04-21 14:58:44.071108047 +0000 UTC m=+170.206085688" Apr 21 14:58:48.033082 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.033047 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vbmd9" Apr 21 14:58:48.781220 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.781186 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7898bf5948-9rzqb"] Apr 21 14:58:48.786093 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.786071 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.789387 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.789332 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 14:58:48.789527 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.789417 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 14:58:48.790173 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.790148 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 14:58:48.790859 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.790416 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 14:58:48.790859 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.790465 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ks97j\"" Apr 21 14:58:48.791178 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.791115 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 14:58:48.797371 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.797345 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7898bf5948-9rzqb"] Apr 21 14:58:48.804265 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.804226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-serving-cert\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.804363 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.804312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-oauth-config\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.804363 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.804351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-oauth-serving-cert\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.804445 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.804408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpp24\" (UniqueName: \"kubernetes.io/projected/d0e3865d-0753-49ed-9577-c3b7936ecbc1-kube-api-access-lpp24\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.804495 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.804460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-config\") pod 
\"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.804572 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.804553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-service-ca\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.905044 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.905010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-service-ca\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.905233 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.905073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-serving-cert\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.905233 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.905112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-oauth-config\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.905233 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.905139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-oauth-serving-cert\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.905233 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.905182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpp24\" (UniqueName: \"kubernetes.io/projected/d0e3865d-0753-49ed-9577-c3b7936ecbc1-kube-api-access-lpp24\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.905233 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.905226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-config\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.905879 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.905842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-service-ca\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.906013 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.905967 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-oauth-serving-cert\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.906267 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.906224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-config\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.908079 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.908056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-serving-cert\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.908215 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.908197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-oauth-config\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:48.913707 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:48.913686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpp24\" (UniqueName: \"kubernetes.io/projected/d0e3865d-0753-49ed-9577-c3b7936ecbc1-kube-api-access-lpp24\") pod \"console-7898bf5948-9rzqb\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:49.100149 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:49.100062 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:58:54.254527 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:54.254492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vbmd9_1ada8e2a-356e-4899-913a-b055b92852e4/dns/0.log" Apr 21 14:58:54.452305 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:54.452272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vbmd9_1ada8e2a-356e-4899-913a-b055b92852e4/kube-rbac-proxy/0.log" Apr 21 14:58:54.853132 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:54.853102 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kpdvd_dadbd785-1d07-45b6-868c-c95e20421c54/dns-node-resolver/0.log" Apr 21 14:58:55.446710 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:55.446675 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7898bf5948-9rzqb"] Apr 21 14:58:55.450396 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:58:55.450368 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0e3865d_0753_49ed_9577_c3b7936ecbc1.slice/crio-b954210a793891900ba39e4f1d53bfe95a25b1832ae77af296ea1f0533a992ae WatchSource:0}: Error finding container b954210a793891900ba39e4f1d53bfe95a25b1832ae77af296ea1f0533a992ae: Status 404 returned error can't find the container with id b954210a793891900ba39e4f1d53bfe95a25b1832ae77af296ea1f0533a992ae Apr 21 14:58:55.652422 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:55.652355 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s7jz7_2ee3aa35-2266-4471-8170-7e506d7cd358/serve-healthcheck-canary/0.log" Apr 21 14:58:56.091172 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:56.091100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-x9ttl" event={"ID":"5510583f-77e8-4c46-ab4c-93c316c34fff","Type":"ContainerStarted","Data":"c4b43b9b8d50d3d42e80dd1d67d37eb2976ce7e08077f583288ea37a40896ca2"} Apr 21 14:58:56.091777 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:56.091738 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-x9ttl" Apr 21 14:58:56.092617 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:56.092571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7898bf5948-9rzqb" event={"ID":"d0e3865d-0753-49ed-9577-c3b7936ecbc1","Type":"ContainerStarted","Data":"b954210a793891900ba39e4f1d53bfe95a25b1832ae77af296ea1f0533a992ae"} Apr 21 14:58:56.103501 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:56.103464 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-x9ttl" Apr 21 14:58:56.107805 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:56.107736 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-x9ttl" podStartSLOduration=2.119077872 podStartE2EDuration="18.107721284s" podCreationTimestamp="2026-04-21 14:58:38 +0000 UTC" firstStartedPulling="2026-04-21 14:58:39.434769513 +0000 UTC m=+165.569747100" lastFinishedPulling="2026-04-21 14:58:55.423412911 +0000 UTC m=+181.558390512" observedRunningTime="2026-04-21 14:58:56.106991742 +0000 UTC m=+182.241969364" watchObservedRunningTime="2026-04-21 14:58:56.107721284 +0000 UTC m=+182.242698893" Apr 21 14:58:57.736587 
ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.736547 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-769769754-gmnhc"] Apr 21 14:58:57.773989 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.773948 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-769769754-gmnhc"] Apr 21 14:58:57.774411 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.774364 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.788451 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.787791 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 14:58:57.890614 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.890574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-service-ca\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.890828 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.890640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-console-config\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.890828 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.890696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-oauth-serving-cert\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.890828 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.890747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-oauth-config\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.890965 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.890856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmpt2\" (UniqueName: \"kubernetes.io/projected/59f571e2-89c7-48cf-9966-440cf750b2cc-kube-api-access-pmpt2\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.890965 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.890896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-trusted-ca-bundle\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.890965 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.890920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-serving-cert\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.991711 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.991619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-service-ca\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.991711 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.991683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-console-config\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.991947 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.991718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-oauth-serving-cert\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.991947 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.991756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-oauth-config\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.991947 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.991823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpt2\" (UniqueName: \"kubernetes.io/projected/59f571e2-89c7-48cf-9966-440cf750b2cc-kube-api-access-pmpt2\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.991947 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.991859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-trusted-ca-bundle\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.991947 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.991888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-serving-cert\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.993717 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.993686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-trusted-ca-bundle\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.993836 ip-10-0-130-121 
kubenswrapper[2576]: I0421 14:58:57.993802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-oauth-serving-cert\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.994318 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.994276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-service-ca\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.994836 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.994810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-console-config\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.995310 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.995286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-serving-cert\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:57.995706 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:57.995688 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-oauth-config\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:58.005741 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:58.005682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmpt2\" (UniqueName: \"kubernetes.io/projected/59f571e2-89c7-48cf-9966-440cf750b2cc-kube-api-access-pmpt2\") pod \"console-769769754-gmnhc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:58.088992 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:58.088951 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-769769754-gmnhc" Apr 21 14:58:58.990503 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:58.986430 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-769769754-gmnhc"] Apr 21 14:58:59.104061 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:58:59.103982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769769754-gmnhc" event={"ID":"59f571e2-89c7-48cf-9966-440cf750b2cc","Type":"ContainerStarted","Data":"2b7c524d2e59c28c4e4309352c12d51251ab9dca5b01a487472b2c46d1ba6d0f"} Apr 21 14:59:00.109364 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:00.109322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769769754-gmnhc" event={"ID":"59f571e2-89c7-48cf-9966-440cf750b2cc","Type":"ContainerStarted","Data":"ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2"} Apr 21 14:59:00.110898 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:00.110867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7898bf5948-9rzqb" event={"ID":"d0e3865d-0753-49ed-9577-c3b7936ecbc1","Type":"ContainerStarted","Data":"e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f"} Apr 21 14:59:00.136928 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:00.136873 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-769769754-gmnhc" podStartSLOduration=2.601781798 podStartE2EDuration="3.1368573s" podCreationTimestamp="2026-04-21 14:58:57 +0000 UTC" firstStartedPulling="2026-04-21 14:58:58.992856801 +0000 UTC m=+185.127834403" lastFinishedPulling="2026-04-21 14:58:59.527932317 +0000 UTC m=+185.662909905" observedRunningTime="2026-04-21 14:59:00.135076996 +0000 UTC m=+186.270054640" watchObservedRunningTime="2026-04-21 14:59:00.1368573 +0000 UTC m=+186.271834910" Apr 21 14:59:00.156533 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:00.156481 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7898bf5948-9rzqb" podStartSLOduration=8.357888828 podStartE2EDuration="12.156460692s" podCreationTimestamp="2026-04-21 14:58:48 +0000 UTC" firstStartedPulling="2026-04-21 14:58:55.45236233 +0000 UTC m=+181.587339918" lastFinishedPulling="2026-04-21 14:58:59.250934178 +0000 UTC m=+185.385911782" observedRunningTime="2026-04-21 14:59:00.15462959 +0000 UTC m=+186.289607209" watchObservedRunningTime="2026-04-21 14:59:00.156460692 +0000 UTC m=+186.291438302" Apr 21 14:59:08.089907 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:08.089871 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-769769754-gmnhc" Apr 21 14:59:08.089907 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:08.089920 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-769769754-gmnhc" Apr 21 14:59:08.095163 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:08.095139 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-769769754-gmnhc" Apr 21 14:59:08.142664 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:08.142633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-769769754-gmnhc" Apr 21 14:59:08.197196 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:08.197151 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-7898bf5948-9rzqb"] Apr 21 14:59:09.100625 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:09.100594 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:59:23.182190 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:23.182154 2576 generic.go:358] "Generic (PLEG): container finished" podID="fba080c2-51f9-47d4-8e01-bccc93ada876" containerID="79d434590d649ab904cdd8e86fb80622fde74536d3730d06e175fca05ba892cd" exitCode=0 Apr 21 14:59:23.182657 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:23.182231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" event={"ID":"fba080c2-51f9-47d4-8e01-bccc93ada876","Type":"ContainerDied","Data":"79d434590d649ab904cdd8e86fb80622fde74536d3730d06e175fca05ba892cd"} Apr 21 14:59:23.182657 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:23.182568 2576 scope.go:117] "RemoveContainer" containerID="79d434590d649ab904cdd8e86fb80622fde74536d3730d06e175fca05ba892cd" Apr 21 14:59:23.183620 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:23.183600 2576 generic.go:358] "Generic (PLEG): container finished" podID="9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c" containerID="b042d0db1e8687e6c67a422c1d2db988805e2decc4a1a3d7a104dd5bc127de65" exitCode=0 Apr 21 14:59:23.183680 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:23.183639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" event={"ID":"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c","Type":"ContainerDied","Data":"b042d0db1e8687e6c67a422c1d2db988805e2decc4a1a3d7a104dd5bc127de65"} Apr 21 14:59:23.183926 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:23.183914 2576 scope.go:117] "RemoveContainer" containerID="b042d0db1e8687e6c67a422c1d2db988805e2decc4a1a3d7a104dd5bc127de65" Apr 21 14:59:24.187788 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:24.187752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t6zgc" event={"ID":"9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c","Type":"ContainerStarted","Data":"415d2063b870110c12a9730329c45724973e1f6c661ba8ff1eff6a2d5742cb5b"} Apr 21 14:59:24.189269 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:24.189228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-l956p" event={"ID":"fba080c2-51f9-47d4-8e01-bccc93ada876","Type":"ContainerStarted","Data":"41f305665c3c99d75fae9e7a66abbd19479e1c84494ce9671d63472cce0ef9c0"} Apr 21 14:59:33.218287 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.218224 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7898bf5948-9rzqb" podUID="d0e3865d-0753-49ed-9577-c3b7936ecbc1" containerName="console" containerID="cri-o://e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f" gracePeriod=15 Apr 21 14:59:33.492294 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.492270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7898bf5948-9rzqb_d0e3865d-0753-49ed-9577-c3b7936ecbc1/console/0.log" Apr 21 14:59:33.492399 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.492329 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:59:33.512635 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.512609 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-serving-cert\") pod \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " Apr 21 14:59:33.512768 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.512641 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-config\") pod \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " Apr 21 14:59:33.512768 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.512664 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpp24\" (UniqueName: \"kubernetes.io/projected/d0e3865d-0753-49ed-9577-c3b7936ecbc1-kube-api-access-lpp24\") pod \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " Apr 21 14:59:33.512768 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.512686 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-service-ca\") pod \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " Apr 21 14:59:33.512768 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.512728 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-oauth-config\") pod \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " Apr 21 14:59:33.512768 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.512746 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-oauth-serving-cert\") pod \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\" (UID: \"d0e3865d-0753-49ed-9577-c3b7936ecbc1\") " Apr 21 14:59:33.513080 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.513053 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-config" (OuterVolumeSpecName: "console-config") pod "d0e3865d-0753-49ed-9577-c3b7936ecbc1" (UID: "d0e3865d-0753-49ed-9577-c3b7936ecbc1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:33.513150 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.513124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-service-ca" (OuterVolumeSpecName: "service-ca") pod "d0e3865d-0753-49ed-9577-c3b7936ecbc1" (UID: "d0e3865d-0753-49ed-9577-c3b7936ecbc1"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:33.513221 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.513193 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d0e3865d-0753-49ed-9577-c3b7936ecbc1" (UID: "d0e3865d-0753-49ed-9577-c3b7936ecbc1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:33.515613 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.515537 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d0e3865d-0753-49ed-9577-c3b7936ecbc1" (UID: "d0e3865d-0753-49ed-9577-c3b7936ecbc1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:33.515729 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.515698 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e3865d-0753-49ed-9577-c3b7936ecbc1-kube-api-access-lpp24" (OuterVolumeSpecName: "kube-api-access-lpp24") pod "d0e3865d-0753-49ed-9577-c3b7936ecbc1" (UID: "d0e3865d-0753-49ed-9577-c3b7936ecbc1"). InnerVolumeSpecName "kube-api-access-lpp24". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:59:33.515824 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.515793 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d0e3865d-0753-49ed-9577-c3b7936ecbc1" (UID: "d0e3865d-0753-49ed-9577-c3b7936ecbc1"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:33.614103 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.614070 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-service-ca\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:33.614103 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.614098 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-oauth-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:33.614103 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.614108 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-oauth-serving-cert\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:33.614341 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.614118 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-serving-cert\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:33.614341 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.614127 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0e3865d-0753-49ed-9577-c3b7936ecbc1-console-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:33.614341 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:33.614136 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lpp24\" (UniqueName: \"kubernetes.io/projected/d0e3865d-0753-49ed-9577-c3b7936ecbc1-kube-api-access-lpp24\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:34.221205 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.221177 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7898bf5948-9rzqb_d0e3865d-0753-49ed-9577-c3b7936ecbc1/console/0.log" Apr 21 14:59:34.221590 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.221219 2576 generic.go:358] "Generic (PLEG): container finished" podID="d0e3865d-0753-49ed-9577-c3b7936ecbc1" containerID="e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f" exitCode=2 Apr 21 14:59:34.221590 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.221299 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7898bf5948-9rzqb" Apr 21 14:59:34.221590 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.221308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7898bf5948-9rzqb" event={"ID":"d0e3865d-0753-49ed-9577-c3b7936ecbc1","Type":"ContainerDied","Data":"e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f"} Apr 21 14:59:34.221590 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.221342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7898bf5948-9rzqb" event={"ID":"d0e3865d-0753-49ed-9577-c3b7936ecbc1","Type":"ContainerDied","Data":"b954210a793891900ba39e4f1d53bfe95a25b1832ae77af296ea1f0533a992ae"} Apr 21 14:59:34.221590 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.221356 2576 scope.go:117] "RemoveContainer" containerID="e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f" Apr 21 14:59:34.230355 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.230337 2576 scope.go:117] "RemoveContainer" containerID="e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f" Apr 21 14:59:34.230595 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:59:34.230576 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f\": container with ID starting with e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f not found: ID does not exist" containerID="e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f" Apr 21 14:59:34.230642 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.230604 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f"} err="failed to get container status \"e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f\": rpc error: code = NotFound desc = could not find container \"e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f\": container with ID starting with e436ed20922644142a3f1d7ae691e7af6ed06b1f5585381cd46f44b77d847b7f not found: ID does not exist" Apr 21 14:59:34.241859 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.241834 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7898bf5948-9rzqb"] Apr 21 14:59:34.247894 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.247874 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7898bf5948-9rzqb"] Apr 21 14:59:34.450435 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:34.450398 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e3865d-0753-49ed-9577-c3b7936ecbc1" path="/var/lib/kubelet/pods/d0e3865d-0753-49ed-9577-c3b7936ecbc1/volumes" Apr 21 14:59:50.648986 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:50.648943 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:59:50.649615 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:50.649557 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="alertmanager" containerID="cri-o://b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c" gracePeriod=120 Apr 21 14:59:50.649687 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:50.649632 2576 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy-metric" containerID="cri-o://c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a" gracePeriod=120 Apr 21 14:59:50.649741 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:50.649649 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="config-reloader" containerID="cri-o://8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123" gracePeriod=120 Apr 21 14:59:50.649741 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:50.649702 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy" containerID="cri-o://ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b" gracePeriod=120 Apr 21 14:59:50.649741 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:50.649649 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy-web" containerID="cri-o://8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795" gracePeriod=120 Apr 21 14:59:50.649878 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:50.649726 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="prom-label-proxy" containerID="cri-o://8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae" gracePeriod=120 Apr 21 14:59:51.272474 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272439 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d85db67-bde4-498a-9a86-f806d6da0188" containerID="8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae" exitCode=0 Apr 21 14:59:51.272474 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272465 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d85db67-bde4-498a-9a86-f806d6da0188" containerID="c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a" exitCode=0 Apr 21 14:59:51.272474 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272472 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d85db67-bde4-498a-9a86-f806d6da0188" containerID="ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b" exitCode=0 Apr 21 14:59:51.272474 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272477 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d85db67-bde4-498a-9a86-f806d6da0188" containerID="8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123" exitCode=0 Apr 21 14:59:51.272474 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272483 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d85db67-bde4-498a-9a86-f806d6da0188" containerID="b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c" exitCode=0 Apr 21 14:59:51.272757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerDied","Data":"8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae"} Apr 21 14:59:51.272757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerDied","Data":"c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a"} Apr 21 14:59:51.272757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerDied","Data":"ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b"} Apr 21 14:59:51.272757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272563 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerDied","Data":"8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123"} Apr 21 14:59:51.272757 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.272573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerDied","Data":"b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c"} Apr 21 14:59:51.885068 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.885046 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:51.976795 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976695 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l79zg\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-kube-api-access-l79zg\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.976795 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.976795 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976783 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-main-tls\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977074 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976818 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-config-out\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977074 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976861 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977074 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976883 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-cluster-tls-config\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977074 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976907 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-metric\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977074 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976934 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-tls-assets\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977074 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976959 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-main-db\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977074 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.976993 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-web\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977074 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.977035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-web-config\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977472 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.977079 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-config-volume\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977472 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.977111 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-metrics-client-ca\") pod \"7d85db67-bde4-498a-9a86-f806d6da0188\" (UID: \"7d85db67-bde4-498a-9a86-f806d6da0188\") " Apr 21 14:59:51.977472 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.977307 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: 
"alertmanager-trusted-ca-bundle") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:51.977472 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.977448 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:51.977737 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.977716 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 14:59:51.978338 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.978310 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:59:51.980024 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.979985 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:51.980515 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.980423 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:51.980515 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.980435 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:51.980734 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.980573 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:51.980734 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.980697 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-config-out" (OuterVolumeSpecName: "config-out") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 14:59:51.980972 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.980947 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-kube-api-access-l79zg" (OuterVolumeSpecName: "kube-api-access-l79zg") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "kube-api-access-l79zg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:59:51.981419 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.981392 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:51.982547 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.982525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 14:59:51.985150 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.985127 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:51.991987 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:51.991965 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-web-config" (OuterVolumeSpecName: "web-config") pod "7d85db67-bde4-498a-9a86-f806d6da0188" (UID: "7d85db67-bde4-498a-9a86-f806d6da0188"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 14:59:52.078584 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078543 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l79zg\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-kube-api-access-l79zg\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078584 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078578 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078584 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078592 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-main-tls\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078605 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-config-out\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078618 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-cluster-tls-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078632 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078644 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7d85db67-bde4-498a-9a86-f806d6da0188-tls-assets\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078659 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7d85db67-bde4-498a-9a86-f806d6da0188-alertmanager-main-db\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078679 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078692 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-web-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078703 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/7d85db67-bde4-498a-9a86-f806d6da0188-config-volume\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.078841 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.078714 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d85db67-bde4-498a-9a86-f806d6da0188-metrics-client-ca\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 14:59:52.277908 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.277872 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d85db67-bde4-498a-9a86-f806d6da0188" containerID="8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795" exitCode=0 Apr 21 14:59:52.278075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.277960 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerDied","Data":"8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795"} Apr 21 14:59:52.278075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.277971 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.278075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.277998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7d85db67-bde4-498a-9a86-f806d6da0188","Type":"ContainerDied","Data":"fdd6240355ac07d57530ea54113137e7cc3813e3f5e55c99f253e9bb624d1b37"} Apr 21 14:59:52.278075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.278015 2576 scope.go:117] "RemoveContainer" containerID="8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae" Apr 21 14:59:52.286327 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.286306 2576 scope.go:117] "RemoveContainer" containerID="c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a" Apr 21 14:59:52.293134 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.293117 2576 scope.go:117] "RemoveContainer" containerID="ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b" Apr 21 14:59:52.300562 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.300540 2576 scope.go:117] "RemoveContainer" containerID="8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795" Apr 21 14:59:52.302359 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.302337 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:59:52.306048 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.306019 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:59:52.307549 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.307537 2576 scope.go:117] "RemoveContainer" containerID="8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123" Apr 21 14:59:52.313837 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.313817 2576 scope.go:117] "RemoveContainer" containerID="b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c" Apr 21 14:59:52.320274 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.320234 2576 scope.go:117] "RemoveContainer" containerID="97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a" Apr 21 14:59:52.326758 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.326740 2576 scope.go:117] "RemoveContainer" 
containerID="8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae" Apr 21 14:59:52.326997 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:59:52.326978 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae\": container with ID starting with 8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae not found: ID does not exist" containerID="8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae" Apr 21 14:59:52.327052 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.327006 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae"} err="failed to get container status \"8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae\": rpc error: code = NotFound desc = could not find container \"8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae\": container with ID starting with 8c88f2e1731905be7de2d81616bc98c2349618285eaf827d4c74e47c961a01ae not found: ID does not exist" Apr 21 14:59:52.327052 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.327024 2576 scope.go:117] "RemoveContainer" containerID="c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a" Apr 21 14:59:52.327295 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:59:52.327271 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a\": container with ID starting with c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a not found: ID does not exist" containerID="c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a" Apr 21 14:59:52.327354 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.327302 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a"} err="failed to get container status \"c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a\": rpc error: code = NotFound desc = could not find container \"c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a\": container with ID starting with c59c2e54e31b927f53e3e9c4e55f1c6a54935545e9a1ec172385bc164e4d2a5a not found: ID does not exist" Apr 21 14:59:52.327354 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.327318 2576 scope.go:117] "RemoveContainer" containerID="ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b" Apr 21 14:59:52.327551 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:59:52.327535 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b\": container with ID starting with ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b not found: ID does not exist" containerID="ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b" Apr 21 14:59:52.327596 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.327556 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b"} err="failed to get container status \"ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b\": rpc error: code = NotFound desc = could not 
find container \"ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b\": container with ID starting with ee49fde5e99b5dc8c71508d9f6a85a7dcf7b5cd437cede7877e9e578b34bd31b not found: ID does not exist" Apr 21 14:59:52.327596 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.327570 2576 scope.go:117] "RemoveContainer" containerID="8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795" Apr 21 14:59:52.327782 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:59:52.327765 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795\": container with ID starting with 8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795 not found: ID does not exist" containerID="8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795" Apr 21 14:59:52.327846 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.327790 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795"} err="failed to get container status \"8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795\": rpc error: code = NotFound desc = could not find container \"8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795\": container with ID starting with 8a854cc3d0b82107ca3c041731f995b4a06a7d14a18e9e66b61e144b0c83f795 not found: ID does not exist" Apr 21 14:59:52.327846 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.327813 2576 scope.go:117] "RemoveContainer" containerID="8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123" Apr 21 14:59:52.328040 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:59:52.328023 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123\": container with ID starting with 8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123 not found: ID does not exist" containerID="8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123" Apr 21 14:59:52.328091 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.328043 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123"} err="failed to get container status \"8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123\": rpc error: code = NotFound desc = could not find container \"8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123\": container with ID starting with 8660ac411e6fe0489050d76a5a37f8709d228b1ac4610c7f2e08f59bee67c123 not found: ID does not exist" Apr 21 14:59:52.328091 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.328057 2576 scope.go:117] "RemoveContainer" containerID="b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c" Apr 21 14:59:52.328298 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:59:52.328278 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c\": container with ID starting with b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c not found: ID does not exist" containerID="b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c" Apr 21 14:59:52.328345 ip-10-0-130-121 kubenswrapper[2576]: I0421 
14:59:52.328305 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c"} err="failed to get container status \"b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c\": rpc error: code = NotFound desc = could not find container \"b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c\": container with ID starting with b4f82ee76cbb46dc9d8465e697aaa53fee3e45e0819b3315e93394bd9119c95c not found: ID does not exist" Apr 21 14:59:52.328345 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.328323 2576 scope.go:117] "RemoveContainer" containerID="97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a" Apr 21 14:59:52.328559 ip-10-0-130-121 kubenswrapper[2576]: E0421 14:59:52.328534 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a\": container with ID starting with 97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a not found: ID does not exist" containerID="97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a" Apr 21 14:59:52.328646 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.328562 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a"} err="failed to get container status \"97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a\": rpc error: code = NotFound desc = could not find container \"97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a\": container with ID starting with 97df0f12e79bf4d5b2ac7037e9dc242fed884167af14a8188115e83a532ac34a not found: ID does not exist" Apr 21 14:59:52.336345 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336313 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:59:52.336619 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336608 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy" Apr 21 14:59:52.336661 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336620 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy" Apr 21 14:59:52.336661 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336632 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy-web" Apr 21 14:59:52.336661 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336639 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy-web" Apr 21 14:59:52.336661 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336650 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="alertmanager" Apr 21 14:59:52.336661 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336655 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="alertmanager" Apr 21 14:59:52.336661 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336663 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="config-reloader" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336668 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="config-reloader" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336673 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="init-config-reloader" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336679 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="init-config-reloader" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336686 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy-metric" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336691 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy-metric" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336696 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0e3865d-0753-49ed-9577-c3b7936ecbc1" containerName="console" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336700 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e3865d-0753-49ed-9577-c3b7936ecbc1" containerName="console" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336708 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="prom-label-proxy" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336713 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="prom-label-proxy" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336770 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy-metric" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336778 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0e3865d-0753-49ed-9577-c3b7936ecbc1" containerName="console" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336783 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="config-reloader" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336789 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="alertmanager" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336795 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy-web" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336801 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="prom-label-proxy" Apr 21 14:59:52.336825 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.336809 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" containerName="kube-rbac-proxy" Apr 21 14:59:52.341721 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.341693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.343783 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.343762 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 14:59:52.343895 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.343771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 14:59:52.343895 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.343796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 14:59:52.343895 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.343796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 14:59:52.344069 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.343915 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 14:59:52.344069 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.343810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 14:59:52.344150 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.344089 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-69b4n\"" Apr 21 14:59:52.344186 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.344158 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 14:59:52.344868 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.344855 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 14:59:52.349671 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.349651 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 14:59:52.353204 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.353185 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:59:52.450851 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.450807 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d85db67-bde4-498a-9a86-f806d6da0188" path="/var/lib/kubelet/pods/7d85db67-bde4-498a-9a86-f806d6da0188/volumes" Apr 21 14:59:52.481651 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a10a2180-d386-42f7-81e3-1815d058b44b-config-out\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.481651 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481645 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a10a2180-d386-42f7-81e3-1815d058b44b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.481902 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.481902 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a10a2180-d386-42f7-81e3-1815d058b44b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.481902 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.481902 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a10a2180-d386-42f7-81e3-1815d058b44b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.482025 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.482025 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a10a2180-d386-42f7-81e3-1815d058b44b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.482025 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.482025 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.481981 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.482025 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.482001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-web-config\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.482025 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.482019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frvwb\" (UniqueName: \"kubernetes.io/projected/a10a2180-d386-42f7-81e3-1815d058b44b-kube-api-access-frvwb\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.482192 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.482046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-config-volume\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.582995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a10a2180-d386-42f7-81e3-1815d058b44b-config-out\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a10a2180-d386-42f7-81e3-1815d058b44b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583075 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a10a2180-d386-42f7-81e3-1815d058b44b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583421 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583421 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a10a2180-d386-42f7-81e3-1815d058b44b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583421 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583421 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a10a2180-d386-42f7-81e3-1815d058b44b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583613 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583613 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583613 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-web-config\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583613 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frvwb\" (UniqueName: \"kubernetes.io/projected/a10a2180-d386-42f7-81e3-1815d058b44b-kube-api-access-frvwb\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583613 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-config-volume\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.583845 
ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.583677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a10a2180-d386-42f7-81e3-1815d058b44b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.584271 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.584054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a10a2180-d386-42f7-81e3-1815d058b44b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.584271 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.584213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a10a2180-d386-42f7-81e3-1815d058b44b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.586598 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.586390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a10a2180-d386-42f7-81e3-1815d058b44b-config-out\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.586598 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.586526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.586598 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.586548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.586829 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.586601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.586829 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.586660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-config-volume\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.586829 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.586686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a10a2180-d386-42f7-81e3-1815d058b44b-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.586829 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.586811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.587112 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.587093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-web-config\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.588388 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.588372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a10a2180-d386-42f7-81e3-1815d058b44b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.594581 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.594565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frvwb\" (UniqueName: \"kubernetes.io/projected/a10a2180-d386-42f7-81e3-1815d058b44b-kube-api-access-frvwb\") pod \"alertmanager-main-0\" (UID: \"a10a2180-d386-42f7-81e3-1815d058b44b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.652642 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.652617 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 14:59:52.784192 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:52.782063 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 14:59:52.788469 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:59:52.788438 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda10a2180_d386_42f7_81e3_1815d058b44b.slice/crio-02d5f1a2c894e69a11899ce7f6807ba6dc5d67eb79de0d5bb6dde1d62c4ad49e WatchSource:0}: Error finding container 02d5f1a2c894e69a11899ce7f6807ba6dc5d67eb79de0d5bb6dde1d62c4ad49e: Status 404 returned error can't find the container with id 02d5f1a2c894e69a11899ce7f6807ba6dc5d67eb79de0d5bb6dde1d62c4ad49e Apr 21 14:59:53.282885 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:53.282852 2576 generic.go:358] "Generic (PLEG): container finished" podID="a10a2180-d386-42f7-81e3-1815d058b44b" containerID="e6f932ce5eabc22375523600d7c7751fbbd61eda171c8932165fdd5847fda478" exitCode=0 Apr 21 14:59:53.283266 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:53.282902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a10a2180-d386-42f7-81e3-1815d058b44b","Type":"ContainerDied","Data":"e6f932ce5eabc22375523600d7c7751fbbd61eda171c8932165fdd5847fda478"} Apr 21 14:59:53.283266 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:53.282925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a10a2180-d386-42f7-81e3-1815d058b44b","Type":"ContainerStarted","Data":"02d5f1a2c894e69a11899ce7f6807ba6dc5d67eb79de0d5bb6dde1d62c4ad49e"} Apr 21 14:59:54.288672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.288636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a10a2180-d386-42f7-81e3-1815d058b44b","Type":"ContainerStarted","Data":"071358c61d65ec1c212893cfb5491761e57de0268ea1150c5c40720aa7da1db8"} Apr 21 14:59:54.288672 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.288675 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a10a2180-d386-42f7-81e3-1815d058b44b","Type":"ContainerStarted","Data":"d3baa2ba895d1d226562e0d2ec0668328a642a7a3cbe5ce7b5a75f88a0ed047b"} Apr 21 14:59:54.289064 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.288690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a10a2180-d386-42f7-81e3-1815d058b44b","Type":"ContainerStarted","Data":"3141036a77cbb31e5003e5849001ee4ff62fb67d11d6a80cbb77c7f9bced8d69"} Apr 21 14:59:54.289064 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.288701 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a10a2180-d386-42f7-81e3-1815d058b44b","Type":"ContainerStarted","Data":"20dacae9ccd4b42fb990be6a413af05778b41a63176da6cc39da399362052fef"} Apr 21 14:59:54.289064 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.288711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a10a2180-d386-42f7-81e3-1815d058b44b","Type":"ContainerStarted","Data":"cb4aed02ed9e41be248424dd578137e86a4eb763d998ebc835a73804b49635ed"} Apr 21 14:59:54.289064 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.288724 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a10a2180-d386-42f7-81e3-1815d058b44b","Type":"ContainerStarted","Data":"349d346c21bf1af454870bbf0d50e6102f09fdcdbed8d95f32f7791c3321b8b1"} Apr 21 14:59:54.321439 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.321293 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.321274206 podStartE2EDuration="2.321274206s" podCreationTimestamp="2026-04-21 14:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:59:54.318660599 +0000 UTC m=+240.453638210" watchObservedRunningTime="2026-04-21 14:59:54.321274206 +0000 UTC m=+240.456251822" Apr 21 14:59:54.688654 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.688569 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-594696bdbb-bnj47"] Apr 21 14:59:54.693430 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.693405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.695696 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.695669 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 14:59:54.696201 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.696181 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 14:59:54.696339 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.696256 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-8f678\"" Apr 21 14:59:54.696339 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.696311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 14:59:54.696568 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.696551 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 14:59:54.696897 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.696878 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 14:59:54.702571 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.702550 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 14:59:54.708600 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.708572 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-594696bdbb-bnj47"] Apr 21 14:59:54.803324 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.803283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-secret-telemeter-client\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.803324 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.803330 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-federate-client-tls\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.803596 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.803352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-telemeter-client-tls\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.803596 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.803447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.803596 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.803491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-metrics-client-ca\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.803596 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.803538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-serving-certs-ca-bundle\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.803596 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.803567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx4lm\" (UniqueName: \"kubernetes.io/projected/18f09002-ebf7-4bc8-872a-a3e22b4843f5-kube-api-access-zx4lm\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.803774 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.803600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.904925 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.904886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: 
\"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.904925 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.904929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-metrics-client-ca\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.905521 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.904947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-serving-certs-ca-bundle\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.905521 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.904966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx4lm\" (UniqueName: \"kubernetes.io/projected/18f09002-ebf7-4bc8-872a-a3e22b4843f5-kube-api-access-zx4lm\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.905521 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.904986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.905521 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.905067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-secret-telemeter-client\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.905521 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.905103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-federate-client-tls\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.905521 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.905137 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-telemeter-client-tls\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.905857 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.905827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-serving-certs-ca-bundle\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: 
\"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.905918 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.905773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-metrics-client-ca\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.906259 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.906214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f09002-ebf7-4bc8-872a-a3e22b4843f5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.909602 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.908192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-telemeter-client-tls\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.909602 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.908720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-secret-telemeter-client\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.909756 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.909709 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.910156 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.910136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/18f09002-ebf7-4bc8-872a-a3e22b4843f5-federate-client-tls\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:54.913704 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:54.913683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx4lm\" (UniqueName: \"kubernetes.io/projected/18f09002-ebf7-4bc8-872a-a3e22b4843f5-kube-api-access-zx4lm\") pod \"telemeter-client-594696bdbb-bnj47\" (UID: \"18f09002-ebf7-4bc8-872a-a3e22b4843f5\") " pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:55.004694 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:55.004659 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" Apr 21 14:59:55.152886 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:55.152862 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-594696bdbb-bnj47"] Apr 21 14:59:55.155503 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:59:55.155474 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18f09002_ebf7_4bc8_872a_a3e22b4843f5.slice/crio-394e9c7f35b637584fdc72cb853d29f764e323c37d34d3ffee28ba6496faca8a WatchSource:0}: Error finding container 394e9c7f35b637584fdc72cb853d29f764e323c37d34d3ffee28ba6496faca8a: Status 404 returned error can't find the container with id 394e9c7f35b637584fdc72cb853d29f764e323c37d34d3ffee28ba6496faca8a Apr 21 14:59:55.294508 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:55.294416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" event={"ID":"18f09002-ebf7-4bc8-872a-a3e22b4843f5","Type":"ContainerStarted","Data":"394e9c7f35b637584fdc72cb853d29f764e323c37d34d3ffee28ba6496faca8a"} Apr 21 14:59:57.302929 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:57.302881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" event={"ID":"18f09002-ebf7-4bc8-872a-a3e22b4843f5","Type":"ContainerStarted","Data":"ec41c8760f2dfcfea89d2ba3de507c85a44b5c6b5739c2bfea44a5ff8f81ad53"} Apr 21 14:59:57.302929 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:57.302919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" event={"ID":"18f09002-ebf7-4bc8-872a-a3e22b4843f5","Type":"ContainerStarted","Data":"e7bf54c9a6b63cb942f1b51fe631342283ccdb0793ca221e5a77396b5c62d2b6"} Apr 21 14:59:57.302929 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:57.302930 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" event={"ID":"18f09002-ebf7-4bc8-872a-a3e22b4843f5","Type":"ContainerStarted","Data":"8182754be7dea6d6b8278a48bdb970dfa06ec9155a0edbe74489cd2bb62f344f"} Apr 21 14:59:57.324135 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:57.324036 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-594696bdbb-bnj47" podStartSLOduration=1.421858904 podStartE2EDuration="3.324021458s" podCreationTimestamp="2026-04-21 14:59:54 +0000 UTC" firstStartedPulling="2026-04-21 14:59:55.157444329 +0000 UTC m=+241.292421916" lastFinishedPulling="2026-04-21 14:59:57.059606884 +0000 UTC m=+243.194584470" observedRunningTime="2026-04-21 14:59:57.323396421 +0000 UTC m=+243.458374053" watchObservedRunningTime="2026-04-21 14:59:57.324021458 +0000 UTC m=+243.458999067" Apr 21 14:59:58.266298 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.266261 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6869bcd4d5-sj785"] Apr 21 14:59:58.269704 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.269688 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.279670 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.279643 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6869bcd4d5-sj785"] Apr 21 14:59:58.335991 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.335961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-oauth-config\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.336343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.336014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-serving-cert\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.336343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.336060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-config\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.336343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.336078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-trusted-ca-bundle\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.336343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.336102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-service-ca\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.336343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.336128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6rpk\" (UniqueName: \"kubernetes.io/projected/cc7e32ad-994d-4cbb-8450-f60dcb34d931-kube-api-access-b6rpk\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.336343 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.336161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-oauth-serving-cert\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.437466 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.437436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-config\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.437466 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.437469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-trusted-ca-bundle\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.437703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.437495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-service-ca\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.437703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.437521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6rpk\" (UniqueName: \"kubernetes.io/projected/cc7e32ad-994d-4cbb-8450-f60dcb34d931-kube-api-access-b6rpk\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.437703 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.437563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-oauth-serving-cert\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.437878 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.437844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-oauth-config\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.438275 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.438092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-serving-cert\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.438275 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.438229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-service-ca\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.438444 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.438374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-config\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.438501 ip-10-0-130-121 kubenswrapper[2576]: 
I0421 14:59:58.438479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-trusted-ca-bundle\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.438562 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.438545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-oauth-serving-cert\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.440559 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.440534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-oauth-config\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.440649 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.440543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-serving-cert\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.446124 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.446104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6rpk\" (UniqueName: \"kubernetes.io/projected/cc7e32ad-994d-4cbb-8450-f60dcb34d931-kube-api-access-b6rpk\") pod \"console-6869bcd4d5-sj785\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.579684 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.579598 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 14:59:58.698096 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:58.698072 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6869bcd4d5-sj785"] Apr 21 14:59:58.700687 ip-10-0-130-121 kubenswrapper[2576]: W0421 14:59:58.700655 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc7e32ad_994d_4cbb_8450_f60dcb34d931.slice/crio-bd156db6776f33a6c9e5a87f60fe74deab7f02d6e62bdf20490bacf45a7f2c5a WatchSource:0}: Error finding container bd156db6776f33a6c9e5a87f60fe74deab7f02d6e62bdf20490bacf45a7f2c5a: Status 404 returned error can't find the container with id bd156db6776f33a6c9e5a87f60fe74deab7f02d6e62bdf20490bacf45a7f2c5a Apr 21 14:59:59.310337 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:59.310298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6869bcd4d5-sj785" event={"ID":"cc7e32ad-994d-4cbb-8450-f60dcb34d931","Type":"ContainerStarted","Data":"addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79"} Apr 21 14:59:59.310337 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:59.310335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6869bcd4d5-sj785" event={"ID":"cc7e32ad-994d-4cbb-8450-f60dcb34d931","Type":"ContainerStarted","Data":"bd156db6776f33a6c9e5a87f60fe74deab7f02d6e62bdf20490bacf45a7f2c5a"} Apr 21 14:59:59.334857 ip-10-0-130-121 kubenswrapper[2576]: I0421 14:59:59.334806 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6869bcd4d5-sj785" podStartSLOduration=1.3347891619999999 podStartE2EDuration="1.334789162s" podCreationTimestamp="2026-04-21 14:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:59:59.333633826 +0000 UTC m=+245.468611439" watchObservedRunningTime="2026-04-21 14:59:59.334789162 +0000 UTC m=+245.469766770" Apr 21 15:00:05.399098 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:05.399049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 15:00:05.401523 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:05.401502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f531f65c-d73f-48df-b4b9-fffda9589a9e-metrics-certs\") pod \"network-metrics-daemon-ktgkr\" (UID: \"f531f65c-d73f-48df-b4b9-fffda9589a9e\") " pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 15:00:05.448847 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:05.448819 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cb4hj\"" Apr 21 15:00:05.457249 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:05.457224 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktgkr" Apr 21 15:00:05.578742 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:05.578707 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ktgkr"] Apr 21 15:00:05.581702 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:00:05.581662 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf531f65c_d73f_48df_b4b9_fffda9589a9e.slice/crio-d845b4c8e41af4221acf31333e1cdb57705892a9c2db828def67adb4d2e769de WatchSource:0}: Error finding container d845b4c8e41af4221acf31333e1cdb57705892a9c2db828def67adb4d2e769de: Status 404 returned error can't find the container with id d845b4c8e41af4221acf31333e1cdb57705892a9c2db828def67adb4d2e769de Apr 21 15:00:06.334892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:06.334852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktgkr" event={"ID":"f531f65c-d73f-48df-b4b9-fffda9589a9e","Type":"ContainerStarted","Data":"d845b4c8e41af4221acf31333e1cdb57705892a9c2db828def67adb4d2e769de"} Apr 21 15:00:07.339763 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:07.339725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktgkr" event={"ID":"f531f65c-d73f-48df-b4b9-fffda9589a9e","Type":"ContainerStarted","Data":"8a23c055cd24eab6183a18bbba39651117934e3badca97e6189ac67665612416"} Apr 21 15:00:07.339763 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:07.339762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktgkr" event={"ID":"f531f65c-d73f-48df-b4b9-fffda9589a9e","Type":"ContainerStarted","Data":"df6f7cd3c611ee0ffcabf170e49706cd21ea1108cb20c7616ae146bdd682de94"} Apr 21 15:00:07.356211 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:07.356155 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ktgkr" podStartSLOduration=252.321208267 podStartE2EDuration="4m13.356140175s" podCreationTimestamp="2026-04-21 14:55:54 +0000 UTC" firstStartedPulling="2026-04-21 15:00:05.583679543 +0000 UTC m=+251.718657130" lastFinishedPulling="2026-04-21 15:00:06.618611448 +0000 UTC m=+252.753589038" observedRunningTime="2026-04-21 15:00:07.354143942 +0000 UTC m=+253.489121562" watchObservedRunningTime="2026-04-21 15:00:07.356140175 +0000 UTC m=+253.491117817" Apr 21 15:00:08.580071 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:08.580023 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 15:00:08.580071 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:08.580085 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 15:00:08.584698 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:08.584676 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 15:00:09.349787 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:09.349753 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 15:00:09.397648 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:09.397616 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-769769754-gmnhc"] Apr 21 15:00:34.417111 ip-10-0-130-121 
kubenswrapper[2576]: I0421 15:00:34.417074 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-769769754-gmnhc" podUID="59f571e2-89c7-48cf-9966-440cf750b2cc" containerName="console" containerID="cri-o://ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2" gracePeriod=15 Apr 21 15:00:34.674857 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.674834 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-769769754-gmnhc_59f571e2-89c7-48cf-9966-440cf750b2cc/console/0.log" Apr 21 15:00:34.674977 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.674894 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-769769754-gmnhc" Apr 21 15:00:34.740883 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.740844 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-service-ca\") pod \"59f571e2-89c7-48cf-9966-440cf750b2cc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " Apr 21 15:00:34.741040 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.740898 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmpt2\" (UniqueName: \"kubernetes.io/projected/59f571e2-89c7-48cf-9966-440cf750b2cc-kube-api-access-pmpt2\") pod \"59f571e2-89c7-48cf-9966-440cf750b2cc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " Apr 21 15:00:34.741040 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.740953 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-console-config\") pod \"59f571e2-89c7-48cf-9966-440cf750b2cc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " Apr 21 15:00:34.741040 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.741014 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-oauth-config\") pod \"59f571e2-89c7-48cf-9966-440cf750b2cc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " Apr 21 15:00:34.741174 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.741041 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-oauth-serving-cert\") pod \"59f571e2-89c7-48cf-9966-440cf750b2cc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " Apr 21 15:00:34.741174 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.741063 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-trusted-ca-bundle\") pod \"59f571e2-89c7-48cf-9966-440cf750b2cc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " Apr 21 15:00:34.741174 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.741104 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-serving-cert\") pod \"59f571e2-89c7-48cf-9966-440cf750b2cc\" (UID: \"59f571e2-89c7-48cf-9966-440cf750b2cc\") " Apr 21 15:00:34.741370 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.741214 2576 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-service-ca" (OuterVolumeSpecName: "service-ca") pod "59f571e2-89c7-48cf-9966-440cf750b2cc" (UID: "59f571e2-89c7-48cf-9966-440cf750b2cc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:00:34.741463 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.741425 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-console-config" (OuterVolumeSpecName: "console-config") pod "59f571e2-89c7-48cf-9966-440cf750b2cc" (UID: "59f571e2-89c7-48cf-9966-440cf750b2cc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:00:34.741528 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.741472 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "59f571e2-89c7-48cf-9966-440cf750b2cc" (UID: "59f571e2-89c7-48cf-9966-440cf750b2cc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:00:34.741694 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.741671 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "59f571e2-89c7-48cf-9966-440cf750b2cc" (UID: "59f571e2-89c7-48cf-9966-440cf750b2cc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:00:34.743379 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.743350 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "59f571e2-89c7-48cf-9966-440cf750b2cc" (UID: "59f571e2-89c7-48cf-9966-440cf750b2cc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:00:34.743553 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.743535 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f571e2-89c7-48cf-9966-440cf750b2cc-kube-api-access-pmpt2" (OuterVolumeSpecName: "kube-api-access-pmpt2") pod "59f571e2-89c7-48cf-9966-440cf750b2cc" (UID: "59f571e2-89c7-48cf-9966-440cf750b2cc"). InnerVolumeSpecName "kube-api-access-pmpt2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:00:34.743604 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.743550 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "59f571e2-89c7-48cf-9966-440cf750b2cc" (UID: "59f571e2-89c7-48cf-9966-440cf750b2cc"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:00:34.841911 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.841868 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-oauth-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:00:34.841911 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.841903 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-oauth-serving-cert\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:00:34.841911 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.841917 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-trusted-ca-bundle\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:00:34.842139 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.841929 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59f571e2-89c7-48cf-9966-440cf750b2cc-console-serving-cert\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:00:34.842139 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.841942 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-service-ca\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:00:34.842139 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.841955 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pmpt2\" (UniqueName: \"kubernetes.io/projected/59f571e2-89c7-48cf-9966-440cf750b2cc-kube-api-access-pmpt2\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:00:34.842139 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:34.841966 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59f571e2-89c7-48cf-9966-440cf750b2cc-console-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:00:35.422726 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.422687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-769769754-gmnhc_59f571e2-89c7-48cf-9966-440cf750b2cc/console/0.log" Apr 21 15:00:35.423180 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.422739 2576 generic.go:358] "Generic (PLEG): container finished" podID="59f571e2-89c7-48cf-9966-440cf750b2cc" containerID="ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2" exitCode=2 Apr 21 15:00:35.423180 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.422795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769769754-gmnhc" event={"ID":"59f571e2-89c7-48cf-9966-440cf750b2cc","Type":"ContainerDied","Data":"ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2"} Apr 21 15:00:35.423180 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.422818 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-769769754-gmnhc" Apr 21 15:00:35.423180 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.422840 2576 scope.go:117] "RemoveContainer" containerID="ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2" Apr 21 15:00:35.423180 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.422827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769769754-gmnhc" event={"ID":"59f571e2-89c7-48cf-9966-440cf750b2cc","Type":"ContainerDied","Data":"2b7c524d2e59c28c4e4309352c12d51251ab9dca5b01a487472b2c46d1ba6d0f"} Apr 21 15:00:35.431521 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.431489 2576 scope.go:117] "RemoveContainer" containerID="ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2" Apr 21 15:00:35.431792 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:00:35.431774 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2\": container with ID starting with ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2 not found: ID does not exist" containerID="ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2" Apr 21 15:00:35.431848 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.431801 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2"} err="failed to get container status \"ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2\": rpc error: code = NotFound desc = could not find container \"ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2\": container with ID starting with ec835f84bb873045618c389e99c8055d7cad4f7f14da8a79ccfccc9cf8894fb2 not found: ID does not exist" Apr 21 15:00:35.442578 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.442553 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-769769754-gmnhc"] Apr 21 15:00:35.445949 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:35.445928 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-769769754-gmnhc"] Apr 21 15:00:36.451510 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:36.451473 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f571e2-89c7-48cf-9966-440cf750b2cc" path="/var/lib/kubelet/pods/59f571e2-89c7-48cf-9966-440cf750b2cc/volumes" Apr 21 15:00:54.328000 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:00:54.327972 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 15:01:33.679780 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.679740 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56c8dc547d-jl8h8"] Apr 21 15:01:33.680307 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.680225 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59f571e2-89c7-48cf-9966-440cf750b2cc" containerName="console" Apr 21 15:01:33.680307 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.680268 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f571e2-89c7-48cf-9966-440cf750b2cc" containerName="console" Apr 21 15:01:33.680421 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.680393 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="59f571e2-89c7-48cf-9966-440cf750b2cc" containerName="console" Apr 21 15:01:33.683475 ip-10-0-130-121 
kubenswrapper[2576]: I0421 15:01:33.683454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.692783 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.692761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56c8dc547d-jl8h8"] Apr 21 15:01:33.753299 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.753271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-oauth-serving-cert\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.753415 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.753351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-config\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.753415 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.753373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwdns\" (UniqueName: \"kubernetes.io/projected/1c059704-159b-4a52-90e0-6e3ea29cb80e-kube-api-access-vwdns\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.753489 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.753426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-serving-cert\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.753523 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.753489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-trusted-ca-bundle\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.753566 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.753547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-oauth-config\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.753603 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.753575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-service-ca\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.854006 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.853965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-config\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.854006 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.854005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwdns\" (UniqueName: \"kubernetes.io/projected/1c059704-159b-4a52-90e0-6e3ea29cb80e-kube-api-access-vwdns\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.854356 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.854032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-serving-cert\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.854356 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.854065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-trusted-ca-bundle\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.854356 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.854201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-oauth-config\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.854356 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.854268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-service-ca\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.854356 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.854327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-oauth-serving-cert\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.854772 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.854748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-config\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.855033 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.855011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-oauth-serving-cert\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 
15:01:33.855094 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.855011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-trusted-ca-bundle\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.855436 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.855420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-service-ca\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.857388 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.857366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-oauth-config\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.857467 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.857369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-serving-cert\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.862012 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.861988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwdns\" (UniqueName: \"kubernetes.io/projected/1c059704-159b-4a52-90e0-6e3ea29cb80e-kube-api-access-vwdns\") pod \"console-56c8dc547d-jl8h8\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:33.993346 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:33.993223 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:34.119533 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:34.119511 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56c8dc547d-jl8h8"] Apr 21 15:01:34.122257 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:01:34.122206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c059704_159b_4a52_90e0_6e3ea29cb80e.slice/crio-36c7034230e460e47679adac9719e1a6619c0f96a84d7e7465ce1656b46e6ab9 WatchSource:0}: Error finding container 36c7034230e460e47679adac9719e1a6619c0f96a84d7e7465ce1656b46e6ab9: Status 404 returned error can't find the container with id 36c7034230e460e47679adac9719e1a6619c0f96a84d7e7465ce1656b46e6ab9 Apr 21 15:01:34.124039 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:34.124020 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:01:34.591183 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:34.591148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c8dc547d-jl8h8" event={"ID":"1c059704-159b-4a52-90e0-6e3ea29cb80e","Type":"ContainerStarted","Data":"ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e"} Apr 21 15:01:34.591183 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:34.591186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c8dc547d-jl8h8" event={"ID":"1c059704-159b-4a52-90e0-6e3ea29cb80e","Type":"ContainerStarted","Data":"36c7034230e460e47679adac9719e1a6619c0f96a84d7e7465ce1656b46e6ab9"} Apr 21 15:01:34.612823 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:34.612762 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56c8dc547d-jl8h8" podStartSLOduration=1.612745139 podStartE2EDuration="1.612745139s" podCreationTimestamp="2026-04-21 15:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:01:34.611034881 +0000 UTC m=+340.746012491" watchObservedRunningTime="2026-04-21 15:01:34.612745139 +0000 UTC m=+340.747722749" Apr 21 15:01:43.993415 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:43.993304 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:43.993415 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:43.993347 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:43.998935 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:43.998898 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:44.627616 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:44.627590 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:01:44.681197 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:01:44.681168 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6869bcd4d5-sj785"] Apr 21 15:02:09.700851 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:09.700789 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6869bcd4d5-sj785" podUID="cc7e32ad-994d-4cbb-8450-f60dcb34d931" 
containerName="console" containerID="cri-o://addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79" gracePeriod=15 Apr 21 15:02:09.945609 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:09.945584 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6869bcd4d5-sj785_cc7e32ad-994d-4cbb-8450-f60dcb34d931/console/0.log" Apr 21 15:02:09.945738 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:09.945650 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 15:02:10.068714 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.068670 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-trusted-ca-bundle\") pod \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " Apr 21 15:02:10.068714 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.068718 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6rpk\" (UniqueName: \"kubernetes.io/projected/cc7e32ad-994d-4cbb-8450-f60dcb34d931-kube-api-access-b6rpk\") pod \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " Apr 21 15:02:10.068972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.068742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-oauth-config\") pod \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " Apr 21 15:02:10.068972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.068792 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-service-ca\") pod \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " Apr 21 15:02:10.068972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.068821 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-oauth-serving-cert\") pod \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " Apr 21 15:02:10.068972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.068852 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-config\") pod \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " Apr 21 15:02:10.068972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.068876 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-serving-cert\") pod \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\" (UID: \"cc7e32ad-994d-4cbb-8450-f60dcb34d931\") " Apr 21 15:02:10.069327 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.069296 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cc7e32ad-994d-4cbb-8450-f60dcb34d931" 
(UID: "cc7e32ad-994d-4cbb-8450-f60dcb34d931"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:02:10.069440 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.069410 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-config" (OuterVolumeSpecName: "console-config") pod "cc7e32ad-994d-4cbb-8450-f60dcb34d931" (UID: "cc7e32ad-994d-4cbb-8450-f60dcb34d931"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:02:10.069499 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.069435 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cc7e32ad-994d-4cbb-8450-f60dcb34d931" (UID: "cc7e32ad-994d-4cbb-8450-f60dcb34d931"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:02:10.069499 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.069453 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-service-ca" (OuterVolumeSpecName: "service-ca") pod "cc7e32ad-994d-4cbb-8450-f60dcb34d931" (UID: "cc7e32ad-994d-4cbb-8450-f60dcb34d931"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:02:10.071314 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.071281 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cc7e32ad-994d-4cbb-8450-f60dcb34d931" (UID: "cc7e32ad-994d-4cbb-8450-f60dcb34d931"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:02:10.071433 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.071330 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cc7e32ad-994d-4cbb-8450-f60dcb34d931" (UID: "cc7e32ad-994d-4cbb-8450-f60dcb34d931"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:02:10.071433 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.071358 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc7e32ad-994d-4cbb-8450-f60dcb34d931-kube-api-access-b6rpk" (OuterVolumeSpecName: "kube-api-access-b6rpk") pod "cc7e32ad-994d-4cbb-8450-f60dcb34d931" (UID: "cc7e32ad-994d-4cbb-8450-f60dcb34d931"). InnerVolumeSpecName "kube-api-access-b6rpk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:02:10.169862 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.169813 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-service-ca\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:02:10.169862 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.169857 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-oauth-serving-cert\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:02:10.169862 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.169867 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:02:10.169862 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.169876 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-serving-cert\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:02:10.170098 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.169887 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc7e32ad-994d-4cbb-8450-f60dcb34d931-trusted-ca-bundle\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:02:10.170098 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.169897 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6rpk\" (UniqueName: \"kubernetes.io/projected/cc7e32ad-994d-4cbb-8450-f60dcb34d931-kube-api-access-b6rpk\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:02:10.170098 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.169907 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc7e32ad-994d-4cbb-8450-f60dcb34d931-console-oauth-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:02:10.702534 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.702509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6869bcd4d5-sj785_cc7e32ad-994d-4cbb-8450-f60dcb34d931/console/0.log" Apr 21 15:02:10.702933 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.702551 2576 generic.go:358] "Generic (PLEG): container finished" podID="cc7e32ad-994d-4cbb-8450-f60dcb34d931" containerID="addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79" exitCode=2 Apr 21 15:02:10.702933 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.702617 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6869bcd4d5-sj785" Apr 21 15:02:10.702933 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.702616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6869bcd4d5-sj785" event={"ID":"cc7e32ad-994d-4cbb-8450-f60dcb34d931","Type":"ContainerDied","Data":"addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79"} Apr 21 15:02:10.702933 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.702723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6869bcd4d5-sj785" event={"ID":"cc7e32ad-994d-4cbb-8450-f60dcb34d931","Type":"ContainerDied","Data":"bd156db6776f33a6c9e5a87f60fe74deab7f02d6e62bdf20490bacf45a7f2c5a"} Apr 21 15:02:10.702933 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.702741 2576 scope.go:117] "RemoveContainer" containerID="addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79" Apr 21 15:02:10.710743 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.710717 2576 scope.go:117] "RemoveContainer" containerID="addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79" Apr 21 15:02:10.710983 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:02:10.710964 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79\": container with ID starting with addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79 not found: ID does not exist" containerID="addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79" Apr 21 15:02:10.711043 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.710994 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79"} err="failed to get container status \"addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79\": rpc error: code = NotFound desc = could not find container \"addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79\": container with ID starting with addcaa194089744fcefa786d979d4b1dd89ac87a68de5f1357db5f23aa31dd79 not found: ID does not exist" Apr 21 15:02:10.717730 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.717706 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6869bcd4d5-sj785"] Apr 21 15:02:10.723124 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:10.723102 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6869bcd4d5-sj785"] Apr 21 15:02:12.450707 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:12.450654 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc7e32ad-994d-4cbb-8450-f60dcb34d931" path="/var/lib/kubelet/pods/cc7e32ad-994d-4cbb-8450-f60dcb34d931/volumes" Apr 21 15:02:40.246128 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.246091 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh"] Apr 21 15:02:40.247439 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.247411 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc7e32ad-994d-4cbb-8450-f60dcb34d931" containerName="console" Apr 21 15:02:40.247620 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.247593 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc7e32ad-994d-4cbb-8450-f60dcb34d931" containerName="console" Apr 21 
15:02:40.247813 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.247798 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc7e32ad-994d-4cbb-8450-f60dcb34d931" containerName="console" Apr 21 15:02:40.251879 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.251859 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" Apr 21 15:02:40.253883 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.253860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 15:02:40.253983 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.253886 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:02:40.253983 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.253903 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-swfsb\"" Apr 21 15:02:40.261513 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.261490 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh"] Apr 21 15:02:40.424326 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.424278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkl6v\" (UniqueName: \"kubernetes.io/projected/19cce84d-f315-41d6-8953-975b932b3f66-kube-api-access-tkl6v\") pod \"cert-manager-operator-controller-manager-54b9655956-q59kh\" (UID: \"19cce84d-f315-41d6-8953-975b932b3f66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" Apr 21 15:02:40.424527 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.424356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19cce84d-f315-41d6-8953-975b932b3f66-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-q59kh\" (UID: \"19cce84d-f315-41d6-8953-975b932b3f66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" Apr 21 15:02:40.525758 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.525722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19cce84d-f315-41d6-8953-975b932b3f66-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-q59kh\" (UID: \"19cce84d-f315-41d6-8953-975b932b3f66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" Apr 21 15:02:40.525949 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.525804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkl6v\" (UniqueName: \"kubernetes.io/projected/19cce84d-f315-41d6-8953-975b932b3f66-kube-api-access-tkl6v\") pod \"cert-manager-operator-controller-manager-54b9655956-q59kh\" (UID: \"19cce84d-f315-41d6-8953-975b932b3f66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" Apr 21 15:02:40.526151 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.526125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19cce84d-f315-41d6-8953-975b932b3f66-tmp\") pod 
\"cert-manager-operator-controller-manager-54b9655956-q59kh\" (UID: \"19cce84d-f315-41d6-8953-975b932b3f66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" Apr 21 15:02:40.538376 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.538344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkl6v\" (UniqueName: \"kubernetes.io/projected/19cce84d-f315-41d6-8953-975b932b3f66-kube-api-access-tkl6v\") pod \"cert-manager-operator-controller-manager-54b9655956-q59kh\" (UID: \"19cce84d-f315-41d6-8953-975b932b3f66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" Apr 21 15:02:40.562267 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.562234 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" Apr 21 15:02:40.689876 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.689834 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh"] Apr 21 15:02:40.692605 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:02:40.692579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cce84d_f315_41d6_8953_975b932b3f66.slice/crio-8d425117dfdecb68c415526b4e757e7ee95f862c7ebdf3aff2dafc360a23ae86 WatchSource:0}: Error finding container 8d425117dfdecb68c415526b4e757e7ee95f862c7ebdf3aff2dafc360a23ae86: Status 404 returned error can't find the container with id 8d425117dfdecb68c415526b4e757e7ee95f862c7ebdf3aff2dafc360a23ae86 Apr 21 15:02:40.793363 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:40.793279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" event={"ID":"19cce84d-f315-41d6-8953-975b932b3f66","Type":"ContainerStarted","Data":"8d425117dfdecb68c415526b4e757e7ee95f862c7ebdf3aff2dafc360a23ae86"} Apr 21 15:02:44.808337 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:44.808299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" event={"ID":"19cce84d-f315-41d6-8953-975b932b3f66","Type":"ContainerStarted","Data":"18da84155b727bfcd031f69b5fa7e9dd6147fd4cf9e7a3dff076230aafee6fc0"} Apr 21 15:02:44.852463 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:44.852377 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-q59kh" podStartSLOduration=1.291743305 podStartE2EDuration="4.852360984s" podCreationTimestamp="2026-04-21 15:02:40 +0000 UTC" firstStartedPulling="2026-04-21 15:02:40.695318083 +0000 UTC m=+406.830295685" lastFinishedPulling="2026-04-21 15:02:44.255935777 +0000 UTC m=+410.390913364" observedRunningTime="2026-04-21 15:02:44.851969108 +0000 UTC m=+410.986946717" watchObservedRunningTime="2026-04-21 15:02:44.852360984 +0000 UTC m=+410.987338593" Apr 21 15:02:47.047536 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.047497 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-rn4h9"] Apr 21 15:02:47.051061 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.051042 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:47.053263 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.053222 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 15:02:47.053808 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.053786 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-z64jh\"" Apr 21 15:02:47.054484 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.054466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 15:02:47.065397 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.065372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-rn4h9"] Apr 21 15:02:47.081903 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.081874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/524b14d3-5ea3-464a-8c66-cbbda09e7083-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-rn4h9\" (UID: \"524b14d3-5ea3-464a-8c66-cbbda09e7083\") " pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:47.082040 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.081934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czkcx\" (UniqueName: \"kubernetes.io/projected/524b14d3-5ea3-464a-8c66-cbbda09e7083-kube-api-access-czkcx\") pod \"cert-manager-webhook-587ccfb98-rn4h9\" (UID: \"524b14d3-5ea3-464a-8c66-cbbda09e7083\") " pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:47.182902 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.182861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czkcx\" (UniqueName: \"kubernetes.io/projected/524b14d3-5ea3-464a-8c66-cbbda09e7083-kube-api-access-czkcx\") pod \"cert-manager-webhook-587ccfb98-rn4h9\" (UID: \"524b14d3-5ea3-464a-8c66-cbbda09e7083\") " pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:47.183090 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.182939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/524b14d3-5ea3-464a-8c66-cbbda09e7083-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-rn4h9\" (UID: \"524b14d3-5ea3-464a-8c66-cbbda09e7083\") " pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:47.191517 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.191479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/524b14d3-5ea3-464a-8c66-cbbda09e7083-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-rn4h9\" (UID: \"524b14d3-5ea3-464a-8c66-cbbda09e7083\") " pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:47.191642 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.191489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czkcx\" (UniqueName: \"kubernetes.io/projected/524b14d3-5ea3-464a-8c66-cbbda09e7083-kube-api-access-czkcx\") pod \"cert-manager-webhook-587ccfb98-rn4h9\" (UID: \"524b14d3-5ea3-464a-8c66-cbbda09e7083\") " pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:47.375796 
ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.375681 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:47.497358 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.497324 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-rn4h9"] Apr 21 15:02:47.500296 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:02:47.500264 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod524b14d3_5ea3_464a_8c66_cbbda09e7083.slice/crio-3bec1e5acf8664e1459c2e3cf8cf9f50219e6b8e6ce4bcee894e15a6f4af63af WatchSource:0}: Error finding container 3bec1e5acf8664e1459c2e3cf8cf9f50219e6b8e6ce4bcee894e15a6f4af63af: Status 404 returned error can't find the container with id 3bec1e5acf8664e1459c2e3cf8cf9f50219e6b8e6ce4bcee894e15a6f4af63af Apr 21 15:02:47.819291 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:47.819260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" event={"ID":"524b14d3-5ea3-464a-8c66-cbbda09e7083","Type":"ContainerStarted","Data":"3bec1e5acf8664e1459c2e3cf8cf9f50219e6b8e6ce4bcee894e15a6f4af63af"} Apr 21 15:02:51.835686 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:51.835641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" event={"ID":"524b14d3-5ea3-464a-8c66-cbbda09e7083","Type":"ContainerStarted","Data":"3553f4efe6b58fd10032082ee97faf08ae0c8f86add77fc4e3e6c6a7758650c4"} Apr 21 15:02:51.836090 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:51.835757 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:02:51.852990 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:51.852940 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" podStartSLOduration=0.60772557 podStartE2EDuration="4.852925058s" podCreationTimestamp="2026-04-21 15:02:47 +0000 UTC" firstStartedPulling="2026-04-21 15:02:47.504637227 +0000 UTC m=+413.639614813" lastFinishedPulling="2026-04-21 15:02:51.749836712 +0000 UTC m=+417.884814301" observedRunningTime="2026-04-21 15:02:51.851574345 +0000 UTC m=+417.986551965" watchObservedRunningTime="2026-04-21 15:02:51.852925058 +0000 UTC m=+417.987902667" Apr 21 15:02:52.303574 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.303543 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-tcfgs"] Apr 21 15:02:52.306939 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.306922 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" Apr 21 15:02:52.308781 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.308757 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-ftcb7\"" Apr 21 15:02:52.313640 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.313605 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-tcfgs"] Apr 21 15:02:52.326642 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.326583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qdz\" (UniqueName: \"kubernetes.io/projected/a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59-kube-api-access-27qdz\") pod \"cert-manager-cainjector-68b757865b-tcfgs\" (UID: \"a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59\") " pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" Apr 21 15:02:52.326642 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.326631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-tcfgs\" (UID: \"a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59\") " pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" Apr 21 15:02:52.427992 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.427962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27qdz\" (UniqueName: \"kubernetes.io/projected/a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59-kube-api-access-27qdz\") pod \"cert-manager-cainjector-68b757865b-tcfgs\" (UID: \"a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59\") " pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" Apr 21 15:02:52.427992 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.427995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-tcfgs\" (UID: \"a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59\") " pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" Apr 21 15:02:52.435449 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.435421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qdz\" (UniqueName: \"kubernetes.io/projected/a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59-kube-api-access-27qdz\") pod \"cert-manager-cainjector-68b757865b-tcfgs\" (UID: \"a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59\") " pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" Apr 21 15:02:52.435542 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.435465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-tcfgs\" (UID: \"a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59\") " pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" Apr 21 15:02:52.617411 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.617327 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" Apr 21 15:02:52.738045 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.738014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-tcfgs"] Apr 21 15:02:52.741385 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:02:52.741358 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4b3d0ca_223a_41ec_9bdf_8e5a6afb4f59.slice/crio-e02fb5a7924da6fbd2a134a6ffb9beb09da85a04ad823752d5d3d40bb9f45ec9 WatchSource:0}: Error finding container e02fb5a7924da6fbd2a134a6ffb9beb09da85a04ad823752d5d3d40bb9f45ec9: Status 404 returned error can't find the container with id e02fb5a7924da6fbd2a134a6ffb9beb09da85a04ad823752d5d3d40bb9f45ec9 Apr 21 15:02:52.840271 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.840218 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" event={"ID":"a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59","Type":"ContainerStarted","Data":"1cbd16718383054f013cbb1779c92f6071138ad0e6c108768765176ba8399748"} Apr 21 15:02:52.840611 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.840280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" event={"ID":"a4b3d0ca-223a-41ec-9bdf-8e5a6afb4f59","Type":"ContainerStarted","Data":"e02fb5a7924da6fbd2a134a6ffb9beb09da85a04ad823752d5d3d40bb9f45ec9"} Apr 21 15:02:52.854481 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:52.854432 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-tcfgs" podStartSLOduration=0.854415506 podStartE2EDuration="854.415506ms" podCreationTimestamp="2026-04-21 15:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:02:52.853077578 +0000 UTC m=+418.988055187" watchObservedRunningTime="2026-04-21 15:02:52.854415506 +0000 UTC m=+418.989393116" Apr 21 15:02:57.843042 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:02:57.843010 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-rn4h9" Apr 21 15:03:05.440823 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.440784 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-lhc2d"] Apr 21 15:03:05.445478 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.445456 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-lhc2d" Apr 21 15:03:05.447377 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.447357 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-sfcgz\"" Apr 21 15:03:05.451219 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.451198 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-lhc2d"] Apr 21 15:03:05.541640 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.541608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5z82\" (UniqueName: \"kubernetes.io/projected/47b4c1fd-4418-4d96-aeb6-6b126d6f293b-kube-api-access-l5z82\") pod \"cert-manager-79c8d999ff-lhc2d\" (UID: \"47b4c1fd-4418-4d96-aeb6-6b126d6f293b\") " pod="cert-manager/cert-manager-79c8d999ff-lhc2d" Apr 21 15:03:05.541802 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.541747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47b4c1fd-4418-4d96-aeb6-6b126d6f293b-bound-sa-token\") pod \"cert-manager-79c8d999ff-lhc2d\" (UID: \"47b4c1fd-4418-4d96-aeb6-6b126d6f293b\") " pod="cert-manager/cert-manager-79c8d999ff-lhc2d" Apr 21 15:03:05.642646 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.642605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47b4c1fd-4418-4d96-aeb6-6b126d6f293b-bound-sa-token\") pod \"cert-manager-79c8d999ff-lhc2d\" (UID: \"47b4c1fd-4418-4d96-aeb6-6b126d6f293b\") " pod="cert-manager/cert-manager-79c8d999ff-lhc2d" Apr 21 15:03:05.642843 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.642679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5z82\" (UniqueName: \"kubernetes.io/projected/47b4c1fd-4418-4d96-aeb6-6b126d6f293b-kube-api-access-l5z82\") pod \"cert-manager-79c8d999ff-lhc2d\" (UID: \"47b4c1fd-4418-4d96-aeb6-6b126d6f293b\") " pod="cert-manager/cert-manager-79c8d999ff-lhc2d" Apr 21 15:03:05.650034 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.649996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47b4c1fd-4418-4d96-aeb6-6b126d6f293b-bound-sa-token\") pod \"cert-manager-79c8d999ff-lhc2d\" (UID: \"47b4c1fd-4418-4d96-aeb6-6b126d6f293b\") " pod="cert-manager/cert-manager-79c8d999ff-lhc2d" Apr 21 15:03:05.650219 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.650199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5z82\" (UniqueName: \"kubernetes.io/projected/47b4c1fd-4418-4d96-aeb6-6b126d6f293b-kube-api-access-l5z82\") pod \"cert-manager-79c8d999ff-lhc2d\" (UID: \"47b4c1fd-4418-4d96-aeb6-6b126d6f293b\") " pod="cert-manager/cert-manager-79c8d999ff-lhc2d" Apr 21 15:03:05.756368 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.756340 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-lhc2d" Apr 21 15:03:05.883489 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:05.883467 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-lhc2d"] Apr 21 15:03:05.885344 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:03:05.885315 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47b4c1fd_4418_4d96_aeb6_6b126d6f293b.slice/crio-653a5abd66b2406628945d05f1b12a0065e4bb9c3e0f4266d88e582620263293 WatchSource:0}: Error finding container 653a5abd66b2406628945d05f1b12a0065e4bb9c3e0f4266d88e582620263293: Status 404 returned error can't find the container with id 653a5abd66b2406628945d05f1b12a0065e4bb9c3e0f4266d88e582620263293 Apr 21 15:03:06.886204 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:06.886171 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-lhc2d" event={"ID":"47b4c1fd-4418-4d96-aeb6-6b126d6f293b","Type":"ContainerStarted","Data":"beb9d5425ee7016b9ad1a460a17a73492dcac99f2daefef1753c899cd939a2f5"} Apr 21 15:03:06.886204 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:06.886208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-lhc2d" event={"ID":"47b4c1fd-4418-4d96-aeb6-6b126d6f293b","Type":"ContainerStarted","Data":"653a5abd66b2406628945d05f1b12a0065e4bb9c3e0f4266d88e582620263293"} Apr 21 15:03:06.904631 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:06.904585 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-lhc2d" podStartSLOduration=1.904572274 podStartE2EDuration="1.904572274s" podCreationTimestamp="2026-04-21 15:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:03:06.903183106 +0000 UTC m=+433.038160715" watchObservedRunningTime="2026-04-21 15:03:06.904572274 +0000 UTC m=+433.039549883" Apr 21 15:03:18.955204 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:18.955122 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq"] Apr 21 15:03:18.960186 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:18.960162 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:18.971276 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:18.971234 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-86sts\"" Apr 21 15:03:18.971276 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:18.971267 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 15:03:18.971417 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:18.971373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 15:03:18.972576 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:18.972563 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 15:03:18.974763 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:18.974746 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 15:03:18.987646 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:18.987624 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq"] Apr 21 15:03:19.063561 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.063533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2e64906-90da-499c-b2bf-68ff27e32f24-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.063689 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.063567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw5pz\" (UniqueName: \"kubernetes.io/projected/d2e64906-90da-499c-b2bf-68ff27e32f24-kube-api-access-jw5pz\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.063689 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.063638 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2e64906-90da-499c-b2bf-68ff27e32f24-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.164435 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.164408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2e64906-90da-499c-b2bf-68ff27e32f24-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.164595 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.164440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw5pz\" (UniqueName: 
\"kubernetes.io/projected/d2e64906-90da-499c-b2bf-68ff27e32f24-kube-api-access-jw5pz\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.164595 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.164579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2e64906-90da-499c-b2bf-68ff27e32f24-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.167039 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.167016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2e64906-90da-499c-b2bf-68ff27e32f24-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.167137 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.167016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2e64906-90da-499c-b2bf-68ff27e32f24-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.179524 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.179503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw5pz\" (UniqueName: \"kubernetes.io/projected/d2e64906-90da-499c-b2bf-68ff27e32f24-kube-api-access-jw5pz\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrjzq\" (UID: \"d2e64906-90da-499c-b2bf-68ff27e32f24\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.271025 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.270990 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:19.434898 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.434716 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq"] Apr 21 15:03:19.437806 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:03:19.437776 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2e64906_90da_499c_b2bf_68ff27e32f24.slice/crio-208d4f7bbc48fc1b29abc9f3075e255292d47e8e7cfde87b4a2d7956acaba9d2 WatchSource:0}: Error finding container 208d4f7bbc48fc1b29abc9f3075e255292d47e8e7cfde87b4a2d7956acaba9d2: Status 404 returned error can't find the container with id 208d4f7bbc48fc1b29abc9f3075e255292d47e8e7cfde87b4a2d7956acaba9d2 Apr 21 15:03:19.931212 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:19.931170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" event={"ID":"d2e64906-90da-499c-b2bf-68ff27e32f24","Type":"ContainerStarted","Data":"208d4f7bbc48fc1b29abc9f3075e255292d47e8e7cfde87b4a2d7956acaba9d2"} Apr 21 15:03:22.943610 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:22.943568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" event={"ID":"d2e64906-90da-499c-b2bf-68ff27e32f24","Type":"ContainerStarted","Data":"407bcfed0c366c128281ae5d61335c5c231392de61d0a3998916c2c5ef75292f"} Apr 21 15:03:22.944036 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:22.943708 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:22.964687 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:22.964639 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" podStartSLOduration=2.477719354 podStartE2EDuration="4.964623654s" podCreationTimestamp="2026-04-21 15:03:18 +0000 UTC" firstStartedPulling="2026-04-21 15:03:19.439442166 +0000 UTC m=+445.574419753" lastFinishedPulling="2026-04-21 15:03:21.926346462 +0000 UTC m=+448.061324053" observedRunningTime="2026-04-21 15:03:22.963151227 +0000 UTC m=+449.098128836" watchObservedRunningTime="2026-04-21 15:03:22.964623654 +0000 UTC m=+449.099601263" Apr 21 15:03:33.949313 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:33.949281 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrjzq" Apr 21 15:03:39.196643 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.196598 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6"] Apr 21 15:03:39.200634 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.200610 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.203293 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.203224 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-k7m4x\"" Apr 21 15:03:39.203433 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.203295 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 15:03:39.203433 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.203230 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 15:03:39.203433 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.203272 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 15:03:39.203433 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.203224 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 15:03:39.213272 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.213232 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6"] Apr 21 15:03:39.233991 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.233955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxzm\" (UniqueName: \"kubernetes.io/projected/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-kube-api-access-rdxzm\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.234126 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.234020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-tmp\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.234126 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.234063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-tls-certs\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.334710 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.334667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxzm\" (UniqueName: \"kubernetes.io/projected/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-kube-api-access-rdxzm\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.334901 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.334747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-tmp\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.334901 ip-10-0-130-121 kubenswrapper[2576]: I0421 
15:03:39.334774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-tls-certs\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.337141 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.337113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-tmp\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.337385 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.337365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-tls-certs\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.342519 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.342496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxzm\" (UniqueName: \"kubernetes.io/projected/ac03d2a6-4496-4c70-a2e0-73f30da8e8af-kube-api-access-rdxzm\") pod \"kube-auth-proxy-5bcc894b57-mjff6\" (UID: \"ac03d2a6-4496-4c70-a2e0-73f30da8e8af\") " pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.510574 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.510530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" Apr 21 15:03:39.643386 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:39.643325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6"] Apr 21 15:03:39.645525 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:03:39.645496 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac03d2a6_4496_4c70_a2e0_73f30da8e8af.slice/crio-4a245446fe0a1e77f84279887670ea768062b3e54f6803d30caf8415f7305285 WatchSource:0}: Error finding container 4a245446fe0a1e77f84279887670ea768062b3e54f6803d30caf8415f7305285: Status 404 returned error can't find the container with id 4a245446fe0a1e77f84279887670ea768062b3e54f6803d30caf8415f7305285 Apr 21 15:03:40.005400 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.005362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" event={"ID":"ac03d2a6-4496-4c70-a2e0-73f30da8e8af","Type":"ContainerStarted","Data":"4a245446fe0a1e77f84279887670ea768062b3e54f6803d30caf8415f7305285"} Apr 21 15:03:40.285999 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.285917 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7hwfx"] Apr 21 15:03:40.289356 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.289338 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:40.292836 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.292803 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 21 15:03:40.292974 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.292845 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-lhcxt\"" Apr 21 15:03:40.302327 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.302299 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7hwfx"] Apr 21 15:03:40.343935 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.343899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn7s2\" (UniqueName: \"kubernetes.io/projected/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-kube-api-access-cn7s2\") pod \"odh-model-controller-858dbf95b8-7hwfx\" (UID: \"e6be0935-dd7b-426e-9d30-cb85b0ce12b3\") " pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:40.344106 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.343960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert\") pod \"odh-model-controller-858dbf95b8-7hwfx\" (UID: \"e6be0935-dd7b-426e-9d30-cb85b0ce12b3\") " pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:40.444935 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.444903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn7s2\" (UniqueName: \"kubernetes.io/projected/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-kube-api-access-cn7s2\") pod \"odh-model-controller-858dbf95b8-7hwfx\" (UID: \"e6be0935-dd7b-426e-9d30-cb85b0ce12b3\") " pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:40.445136 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.444957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert\") pod \"odh-model-controller-858dbf95b8-7hwfx\" (UID: \"e6be0935-dd7b-426e-9d30-cb85b0ce12b3\") " pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:40.445136 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:40.445090 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 21 15:03:40.445273 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:40.445147 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert podName:e6be0935-dd7b-426e-9d30-cb85b0ce12b3 nodeName:}" failed. No retries permitted until 2026-04-21 15:03:40.945127887 +0000 UTC m=+467.080105473 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert") pod "odh-model-controller-858dbf95b8-7hwfx" (UID: "e6be0935-dd7b-426e-9d30-cb85b0ce12b3") : secret "odh-model-controller-webhook-cert" not found Apr 21 15:03:40.454376 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.454349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn7s2\" (UniqueName: \"kubernetes.io/projected/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-kube-api-access-cn7s2\") pod \"odh-model-controller-858dbf95b8-7hwfx\" (UID: \"e6be0935-dd7b-426e-9d30-cb85b0ce12b3\") " pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:40.951471 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:40.951419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert\") pod \"odh-model-controller-858dbf95b8-7hwfx\" (UID: \"e6be0935-dd7b-426e-9d30-cb85b0ce12b3\") " pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:40.951644 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:40.951580 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 21 15:03:40.951688 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:40.951649 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert podName:e6be0935-dd7b-426e-9d30-cb85b0ce12b3 nodeName:}" failed. No retries permitted until 2026-04-21 15:03:41.951632222 +0000 UTC m=+468.086609808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert") pod "odh-model-controller-858dbf95b8-7hwfx" (UID: "e6be0935-dd7b-426e-9d30-cb85b0ce12b3") : secret "odh-model-controller-webhook-cert" not found Apr 21 15:03:41.960727 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:41.960692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert\") pod \"odh-model-controller-858dbf95b8-7hwfx\" (UID: \"e6be0935-dd7b-426e-9d30-cb85b0ce12b3\") " pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:41.963154 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:41.963127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6be0935-dd7b-426e-9d30-cb85b0ce12b3-cert\") pod \"odh-model-controller-858dbf95b8-7hwfx\" (UID: \"e6be0935-dd7b-426e-9d30-cb85b0ce12b3\") " pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:42.103423 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:42.103389 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:42.444065 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:42.444027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7hwfx"] Apr 21 15:03:42.447315 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:03:42.447283 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6be0935_dd7b_426e_9d30_cb85b0ce12b3.slice/crio-9d05b53f7463d10c028b8765e73c31d04108d882571abe90f2a88a4dc3682035 WatchSource:0}: Error finding container 9d05b53f7463d10c028b8765e73c31d04108d882571abe90f2a88a4dc3682035: Status 404 returned error can't find the container with id 9d05b53f7463d10c028b8765e73c31d04108d882571abe90f2a88a4dc3682035 Apr 21 15:03:43.019740 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:43.019699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" event={"ID":"e6be0935-dd7b-426e-9d30-cb85b0ce12b3","Type":"ContainerStarted","Data":"9d05b53f7463d10c028b8765e73c31d04108d882571abe90f2a88a4dc3682035"} Apr 21 15:03:47.037503 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.037457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" event={"ID":"ac03d2a6-4496-4c70-a2e0-73f30da8e8af","Type":"ContainerStarted","Data":"50d366f260f54b555680b01a96585b6693678ad50cf78f50ce1170eefaf380e6"} Apr 21 15:03:47.040127 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.040093 2576 generic.go:358] "Generic (PLEG): container finished" podID="e6be0935-dd7b-426e-9d30-cb85b0ce12b3" containerID="62a36c265fc4b7676b20d2688e5213e8b95536b5d59c2bffd653d8b6b52c5f64" exitCode=1 Apr 21 15:03:47.040342 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.040275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" event={"ID":"e6be0935-dd7b-426e-9d30-cb85b0ce12b3","Type":"ContainerDied","Data":"62a36c265fc4b7676b20d2688e5213e8b95536b5d59c2bffd653d8b6b52c5f64"} Apr 21 15:03:47.040707 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.040674 2576 scope.go:117] "RemoveContainer" containerID="62a36c265fc4b7676b20d2688e5213e8b95536b5d59c2bffd653d8b6b52c5f64" Apr 21 15:03:47.104726 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.104671 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5bcc894b57-mjff6" podStartSLOduration=1.623061178 podStartE2EDuration="8.104656039s" podCreationTimestamp="2026-04-21 15:03:39 +0000 UTC" firstStartedPulling="2026-04-21 15:03:39.647315779 +0000 UTC m=+465.782293366" lastFinishedPulling="2026-04-21 15:03:46.128910641 +0000 UTC m=+472.263888227" observedRunningTime="2026-04-21 15:03:47.09999935 +0000 UTC m=+473.234976958" watchObservedRunningTime="2026-04-21 15:03:47.104656039 +0000 UTC m=+473.239633648" Apr 21 15:03:47.410207 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.410169 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-khrqf"] Apr 21 15:03:47.413644 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.413617 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:47.417747 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.417712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-btn9z\"" Apr 21 15:03:47.417873 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.417844 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 21 15:03:47.437282 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.437142 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-khrqf"] Apr 21 15:03:47.519516 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.519476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjxt\" (UniqueName: \"kubernetes.io/projected/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-kube-api-access-psjxt\") pod \"kserve-controller-manager-856948b99f-khrqf\" (UID: \"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4\") " pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:47.519708 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.519620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-cert\") pod \"kserve-controller-manager-856948b99f-khrqf\" (UID: \"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4\") " pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:47.620665 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.620584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-cert\") pod \"kserve-controller-manager-856948b99f-khrqf\" (UID: \"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4\") " pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:47.620665 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.620654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psjxt\" (UniqueName: \"kubernetes.io/projected/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-kube-api-access-psjxt\") pod \"kserve-controller-manager-856948b99f-khrqf\" (UID: \"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4\") " pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:47.620896 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:47.620699 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 21 15:03:47.620896 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:47.620766 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-cert podName:eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4 nodeName:}" failed. No retries permitted until 2026-04-21 15:03:48.120749924 +0000 UTC m=+474.255727511 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-cert") pod "kserve-controller-manager-856948b99f-khrqf" (UID: "eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4") : secret "kserve-webhook-server-cert" not found Apr 21 15:03:47.648744 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:47.648714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjxt\" (UniqueName: \"kubernetes.io/projected/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-kube-api-access-psjxt\") pod \"kserve-controller-manager-856948b99f-khrqf\" (UID: \"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4\") " pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:48.045927 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:48.045896 2576 generic.go:358] "Generic (PLEG): container finished" podID="e6be0935-dd7b-426e-9d30-cb85b0ce12b3" containerID="419ce430ff8f74daf5ba33fabdba8fe8d42b3a11ef441e3b21c2a185749b0cea" exitCode=1 Apr 21 15:03:48.046401 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:48.045988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" event={"ID":"e6be0935-dd7b-426e-9d30-cb85b0ce12b3","Type":"ContainerDied","Data":"419ce430ff8f74daf5ba33fabdba8fe8d42b3a11ef441e3b21c2a185749b0cea"} Apr 21 15:03:48.046401 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:48.046029 2576 scope.go:117] "RemoveContainer" containerID="62a36c265fc4b7676b20d2688e5213e8b95536b5d59c2bffd653d8b6b52c5f64" Apr 21 15:03:48.046401 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:48.046208 2576 scope.go:117] "RemoveContainer" containerID="419ce430ff8f74daf5ba33fabdba8fe8d42b3a11ef441e3b21c2a185749b0cea" Apr 21 15:03:48.046516 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:48.046455 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-7hwfx_opendatahub(e6be0935-dd7b-426e-9d30-cb85b0ce12b3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" podUID="e6be0935-dd7b-426e-9d30-cb85b0ce12b3" Apr 21 15:03:48.126646 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:48.126598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-cert\") pod \"kserve-controller-manager-856948b99f-khrqf\" (UID: \"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4\") " pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:48.129520 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:48.129495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4-cert\") pod \"kserve-controller-manager-856948b99f-khrqf\" (UID: \"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4\") " pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:48.324471 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:48.324382 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:48.453382 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:48.453100 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-khrqf"] Apr 21 15:03:48.455509 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:03:48.455484 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb6ef7aa_1de2_4b0f_abf1_895c0c7688a4.slice/crio-9132493a5eaf60c5161920b865fe16dfa36e3c208e197551275c3802f538ead6 WatchSource:0}: Error finding container 9132493a5eaf60c5161920b865fe16dfa36e3c208e197551275c3802f538ead6: Status 404 returned error can't find the container with id 9132493a5eaf60c5161920b865fe16dfa36e3c208e197551275c3802f538ead6 Apr 21 15:03:49.051949 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:49.051901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" event={"ID":"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4","Type":"ContainerStarted","Data":"9132493a5eaf60c5161920b865fe16dfa36e3c208e197551275c3802f538ead6"} Apr 21 15:03:49.053442 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:49.053423 2576 scope.go:117] "RemoveContainer" containerID="419ce430ff8f74daf5ba33fabdba8fe8d42b3a11ef441e3b21c2a185749b0cea" Apr 21 15:03:49.053610 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:49.053596 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-7hwfx_opendatahub(e6be0935-dd7b-426e-9d30-cb85b0ce12b3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" podUID="e6be0935-dd7b-426e-9d30-cb85b0ce12b3" Apr 21 15:03:52.067425 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:52.067389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" event={"ID":"eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4","Type":"ContainerStarted","Data":"dfdf2687dc97c703b97059ae26dea22ee94cb113017ef2d9acc55a6747cb47f1"} Apr 21 15:03:52.067854 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:52.067603 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:03:52.103691 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:52.103648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:03:52.104153 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:52.104138 2576 scope.go:117] "RemoveContainer" containerID="419ce430ff8f74daf5ba33fabdba8fe8d42b3a11ef441e3b21c2a185749b0cea" Apr 21 15:03:52.104413 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:03:52.104392 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-7hwfx_opendatahub(e6be0935-dd7b-426e-9d30-cb85b0ce12b3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" podUID="e6be0935-dd7b-426e-9d30-cb85b0ce12b3" Apr 21 15:03:52.135337 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:52.135293 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" podStartSLOduration=2.352080728 
podStartE2EDuration="5.135278974s" podCreationTimestamp="2026-04-21 15:03:47 +0000 UTC" firstStartedPulling="2026-04-21 15:03:48.45715085 +0000 UTC m=+474.592128438" lastFinishedPulling="2026-04-21 15:03:51.240349095 +0000 UTC m=+477.375326684" observedRunningTime="2026-04-21 15:03:52.1325016 +0000 UTC m=+478.267479208" watchObservedRunningTime="2026-04-21 15:03:52.135278974 +0000 UTC m=+478.270256582" Apr 21 15:03:53.305142 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.305095 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vvn88"] Apr 21 15:03:53.308816 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.308791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:53.312370 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.312348 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-k7tps\"" Apr 21 15:03:53.312727 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.312713 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 21 15:03:53.313200 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.313182 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 21 15:03:53.328162 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.328127 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vvn88"] Apr 21 15:03:53.377592 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.377558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8f13475a-aa71-4706-b94d-3b2b1a98a319-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vvn88\" (UID: \"8f13475a-aa71-4706-b94d-3b2b1a98a319\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:53.377747 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.377632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6vts\" (UniqueName: \"kubernetes.io/projected/8f13475a-aa71-4706-b94d-3b2b1a98a319-kube-api-access-l6vts\") pod \"servicemesh-operator3-55f49c5f94-vvn88\" (UID: \"8f13475a-aa71-4706-b94d-3b2b1a98a319\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:53.478889 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.478855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6vts\" (UniqueName: \"kubernetes.io/projected/8f13475a-aa71-4706-b94d-3b2b1a98a319-kube-api-access-l6vts\") pod \"servicemesh-operator3-55f49c5f94-vvn88\" (UID: \"8f13475a-aa71-4706-b94d-3b2b1a98a319\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:53.479076 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.478923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8f13475a-aa71-4706-b94d-3b2b1a98a319-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vvn88\" (UID: \"8f13475a-aa71-4706-b94d-3b2b1a98a319\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:53.481675 ip-10-0-130-121 
kubenswrapper[2576]: I0421 15:03:53.481649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8f13475a-aa71-4706-b94d-3b2b1a98a319-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vvn88\" (UID: \"8f13475a-aa71-4706-b94d-3b2b1a98a319\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:53.489261 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.489208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6vts\" (UniqueName: \"kubernetes.io/projected/8f13475a-aa71-4706-b94d-3b2b1a98a319-kube-api-access-l6vts\") pod \"servicemesh-operator3-55f49c5f94-vvn88\" (UID: \"8f13475a-aa71-4706-b94d-3b2b1a98a319\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:53.618646 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.618566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:53.749572 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:53.749546 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vvn88"] Apr 21 15:03:53.751726 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:03:53.751694 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f13475a_aa71_4706_b94d_3b2b1a98a319.slice/crio-175a7b0f8477111539348908b53e6552bf5c496d33cb294418ff44a6533ec495 WatchSource:0}: Error finding container 175a7b0f8477111539348908b53e6552bf5c496d33cb294418ff44a6533ec495: Status 404 returned error can't find the container with id 175a7b0f8477111539348908b53e6552bf5c496d33cb294418ff44a6533ec495 Apr 21 15:03:54.077148 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:54.077110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" event={"ID":"8f13475a-aa71-4706-b94d-3b2b1a98a319","Type":"ContainerStarted","Data":"175a7b0f8477111539348908b53e6552bf5c496d33cb294418ff44a6533ec495"} Apr 21 15:03:59.103526 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.103426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" event={"ID":"8f13475a-aa71-4706-b94d-3b2b1a98a319","Type":"ContainerStarted","Data":"e4cb9b948f5c93ec5dd41f9d95fcdbb13b9fe036748b77a060e9197767020b65"} Apr 21 15:03:59.103526 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.103499 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:03:59.126939 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.126883 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" podStartSLOduration=1.049015178 podStartE2EDuration="6.126865824s" podCreationTimestamp="2026-04-21 15:03:53 +0000 UTC" firstStartedPulling="2026-04-21 15:03:53.754302535 +0000 UTC m=+479.889280123" lastFinishedPulling="2026-04-21 15:03:58.832153179 +0000 UTC m=+484.967130769" observedRunningTime="2026-04-21 15:03:59.125135589 +0000 UTC m=+485.260113211" watchObservedRunningTime="2026-04-21 15:03:59.126865824 +0000 UTC m=+485.261843434" Apr 21 15:03:59.839666 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.839623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh"] Apr 21 15:03:59.843106 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.843081 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:03:59.845597 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.845577 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 21 15:03:59.845780 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.845754 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-bjftg\"" Apr 21 15:03:59.845878 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.845797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 21 15:03:59.846371 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.846352 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 15:03:59.846371 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.846366 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 21 15:03:59.861341 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.861320 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh"] Apr 21 15:03:59.942606 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.942572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj8t4\" (UniqueName: \"kubernetes.io/projected/507869c0-c040-4125-b81d-d57c30d60623-kube-api-access-fj8t4\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:03:59.942773 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.942619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/507869c0-c040-4125-b81d-d57c30d60623-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:03:59.942773 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.942666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/507869c0-c040-4125-b81d-d57c30d60623-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:03:59.942773 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.942709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:03:59.942773 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.942755 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:03:59.942938 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.942781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:03:59.942938 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:03:59.942800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/507869c0-c040-4125-b81d-d57c30d60623-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.043620 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.043581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/507869c0-c040-4125-b81d-d57c30d60623-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.043765 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.043625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/507869c0-c040-4125-b81d-d57c30d60623-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.043765 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.043654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.043765 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.043688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.043765 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.043723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.043765 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.043750 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/507869c0-c040-4125-b81d-d57c30d60623-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.044034 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.043838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fj8t4\" (UniqueName: \"kubernetes.io/projected/507869c0-c040-4125-b81d-d57c30d60623-kube-api-access-fj8t4\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.044340 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.044311 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/507869c0-c040-4125-b81d-d57c30d60623-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.046319 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.046288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.046440 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.046340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.046544 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.046524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/507869c0-c040-4125-b81d-d57c30d60623-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.046687 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.046669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/507869c0-c040-4125-b81d-d57c30d60623-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.058197 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.058168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj8t4\" (UniqueName: \"kubernetes.io/projected/507869c0-c040-4125-b81d-d57c30d60623-kube-api-access-fj8t4\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.059136 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.059114 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/507869c0-c040-4125-b81d-d57c30d60623-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-g5mdh\" (UID: \"507869c0-c040-4125-b81d-d57c30d60623\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.153439 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.153347 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:00.296100 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:00.296077 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh"] Apr 21 15:04:00.297438 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:04:00.297406 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod507869c0_c040_4125_b81d_d57c30d60623.slice/crio-4990a0f84d91869975388359e1994a2bad8240ce4ed67cae6203a0560b878cb0 WatchSource:0}: Error finding container 4990a0f84d91869975388359e1994a2bad8240ce4ed67cae6203a0560b878cb0: Status 404 returned error can't find the container with id 4990a0f84d91869975388359e1994a2bad8240ce4ed67cae6203a0560b878cb0 Apr 21 15:04:01.111732 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:01.111691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" event={"ID":"507869c0-c040-4125-b81d-d57c30d60623","Type":"ContainerStarted","Data":"4990a0f84d91869975388359e1994a2bad8240ce4ed67cae6203a0560b878cb0"} Apr 21 15:04:02.104281 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:02.104227 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:04:02.104713 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:02.104696 2576 scope.go:117] "RemoveContainer" containerID="419ce430ff8f74daf5ba33fabdba8fe8d42b3a11ef441e3b21c2a185749b0cea" Apr 21 15:04:03.122422 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:03.122382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" event={"ID":"e6be0935-dd7b-426e-9d30-cb85b0ce12b3","Type":"ContainerStarted","Data":"9af15b08329c628bcaf300ae5774bf2ce77423d88dab16b4b33bbc0460228f26"} Apr 21 15:04:03.122989 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:03.122616 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:04:03.145781 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:03.145693 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" podStartSLOduration=3.18924509 podStartE2EDuration="23.145672225s" podCreationTimestamp="2026-04-21 15:03:40 +0000 UTC" firstStartedPulling="2026-04-21 15:03:42.449505903 +0000 UTC m=+468.584483497" lastFinishedPulling="2026-04-21 15:04:02.405933032 +0000 UTC m=+488.540910632" observedRunningTime="2026-04-21 15:04:03.142738723 +0000 UTC m=+489.277716316" watchObservedRunningTime="2026-04-21 15:04:03.145672225 +0000 UTC m=+489.280649835" Apr 21 15:04:03.575556 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:03.575519 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 21 
15:04:03.575652 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:03.575584 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 21 15:04:04.130732 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:04.130672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" event={"ID":"507869c0-c040-4125-b81d-d57c30d60623","Type":"ContainerStarted","Data":"88cf832b54110df9858241f76c3898d92c93ac8d329cc3a89302825b59c2f095"} Apr 21 15:04:04.131305 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:04.131282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:04.133033 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:04.133003 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-g5mdh container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 21 15:04:04.133188 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:04.133059 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" podUID="507869c0-c040-4125-b81d-d57c30d60623" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:04:04.155374 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:04.155052 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" podStartSLOduration=1.879173822 podStartE2EDuration="5.155036356s" podCreationTimestamp="2026-04-21 15:03:59 +0000 UTC" firstStartedPulling="2026-04-21 15:04:00.299422597 +0000 UTC m=+486.434400201" lastFinishedPulling="2026-04-21 15:04:03.575285146 +0000 UTC m=+489.710262735" observedRunningTime="2026-04-21 15:04:04.154209419 +0000 UTC m=+490.289187029" watchObservedRunningTime="2026-04-21 15:04:04.155036356 +0000 UTC m=+490.290013968" Apr 21 15:04:05.136117 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:05.136073 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-g5mdh container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 21 15:04:05.136540 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:05.136168 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" podUID="507869c0-c040-4125-b81d-d57c30d60623" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:04:06.139395 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:06.139366 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g5mdh" Apr 21 15:04:10.109355 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:10.109327 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vvn88" Apr 21 15:04:14.133393 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:14.133361 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-7hwfx" Apr 21 15:04:23.078039 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:04:23.078003 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-khrqf" Apr 21 15:05:08.000648 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.000541 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-fbc6bbc86-8x8m9"] Apr 21 15:05:08.003424 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.003406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.017802 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.017774 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fbc6bbc86-8x8m9"] Apr 21 15:05:08.058538 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.058491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-trusted-ca-bundle\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.058538 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.058542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-oauth-config\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.058855 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.058625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-oauth-serving-cert\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.058855 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.058670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x99v\" (UniqueName: \"kubernetes.io/projected/2adf5643-48d2-4d0b-baa0-940ed6abc933-kube-api-access-5x99v\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.058855 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.058712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-serving-cert\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.058855 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.058745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-config\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.059071 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.058856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-service-ca\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.159677 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.159641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-service-ca\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.159870 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.159710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-trusted-ca-bundle\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.159870 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.159750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-oauth-config\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.159870 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.159790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-oauth-serving-cert\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.159870 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.159816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x99v\" (UniqueName: \"kubernetes.io/projected/2adf5643-48d2-4d0b-baa0-940ed6abc933-kube-api-access-5x99v\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.159870 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.159851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-serving-cert\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.160114 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.159873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-config\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.160537 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.160512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-service-ca\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.160758 ip-10-0-130-121 kubenswrapper[2576]: I0421 
15:05:08.160703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-oauth-serving-cert\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.160885 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.160864 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-trusted-ca-bundle\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.160976 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.160915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-config\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.162995 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.162971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-oauth-config\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.163180 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.163157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2adf5643-48d2-4d0b-baa0-940ed6abc933-console-serving-cert\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.168708 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.168690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x99v\" (UniqueName: \"kubernetes.io/projected/2adf5643-48d2-4d0b-baa0-940ed6abc933-kube-api-access-5x99v\") pod \"console-fbc6bbc86-8x8m9\" (UID: \"2adf5643-48d2-4d0b-baa0-940ed6abc933\") " pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.313846 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.313763 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:08.441515 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:08.441489 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fbc6bbc86-8x8m9"] Apr 21 15:05:08.443930 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:05:08.443902 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2adf5643_48d2_4d0b_baa0_940ed6abc933.slice/crio-66ae91bd3e4869630b6c79c291e2f13d5ab8ff026b39cf76bb76ab08c79bba4d WatchSource:0}: Error finding container 66ae91bd3e4869630b6c79c291e2f13d5ab8ff026b39cf76bb76ab08c79bba4d: Status 404 returned error can't find the container with id 66ae91bd3e4869630b6c79c291e2f13d5ab8ff026b39cf76bb76ab08c79bba4d Apr 21 15:05:09.367208 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:09.367174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fbc6bbc86-8x8m9" event={"ID":"2adf5643-48d2-4d0b-baa0-940ed6abc933","Type":"ContainerStarted","Data":"e98cea931acd2a3d293900ef8dbefe82cf0495a4f472226a2144cdd15b947a7e"} Apr 21 15:05:09.367208 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:09.367207 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fbc6bbc86-8x8m9" event={"ID":"2adf5643-48d2-4d0b-baa0-940ed6abc933","Type":"ContainerStarted","Data":"66ae91bd3e4869630b6c79c291e2f13d5ab8ff026b39cf76bb76ab08c79bba4d"} Apr 21 15:05:09.386907 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:09.386853 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-fbc6bbc86-8x8m9" podStartSLOduration=2.386832748 podStartE2EDuration="2.386832748s" podCreationTimestamp="2026-04-21 15:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:05:09.385883146 +0000 UTC m=+555.520860779" watchObservedRunningTime="2026-04-21 15:05:09.386832748 +0000 UTC m=+555.521810357" Apr 21 15:05:17.858312 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:17.858275 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88"] Apr 21 15:05:17.860975 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:17.860954 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:17.863344 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:17.863320 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 15:05:17.863495 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:17.863357 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 15:05:17.863615 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:17.863601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-lgvpp\"" Apr 21 15:05:17.890379 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:17.890342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88"] Apr 21 15:05:17.950186 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:17.950153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ec108d1-5289-41b1-964f-e7a95ba88aa3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-j7j88\" (UID: \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:17.950186 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:17.950190 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zbj7\" (UniqueName: \"kubernetes.io/projected/4ec108d1-5289-41b1-964f-e7a95ba88aa3-kube-api-access-4zbj7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-j7j88\" (UID: \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:18.051639 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.051608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ec108d1-5289-41b1-964f-e7a95ba88aa3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-j7j88\" (UID: \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:18.051639 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.051641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zbj7\" (UniqueName: \"kubernetes.io/projected/4ec108d1-5289-41b1-964f-e7a95ba88aa3-kube-api-access-4zbj7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-j7j88\" (UID: \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:18.051983 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.051961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ec108d1-5289-41b1-964f-e7a95ba88aa3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-j7j88\" (UID: \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:18.064538 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.064510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4zbj7\" (UniqueName: \"kubernetes.io/projected/4ec108d1-5289-41b1-964f-e7a95ba88aa3-kube-api-access-4zbj7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-j7j88\" (UID: \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:18.171887 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.171808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:18.302138 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.302109 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88"] Apr 21 15:05:18.304730 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:05:18.304699 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec108d1_5289_41b1_964f_e7a95ba88aa3.slice/crio-758c162c6dbb411c26c8e662ff00e309edd1457c8af7de60e131b50c542e8228 WatchSource:0}: Error finding container 758c162c6dbb411c26c8e662ff00e309edd1457c8af7de60e131b50c542e8228: Status 404 returned error can't find the container with id 758c162c6dbb411c26c8e662ff00e309edd1457c8af7de60e131b50c542e8228 Apr 21 15:05:18.314598 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.314550 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:18.314598 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.314583 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:18.319034 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.319012 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:18.405857 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.405822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" event={"ID":"4ec108d1-5289-41b1-964f-e7a95ba88aa3","Type":"ContainerStarted","Data":"758c162c6dbb411c26c8e662ff00e309edd1457c8af7de60e131b50c542e8228"} Apr 21 15:05:18.410020 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.409994 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-fbc6bbc86-8x8m9" Apr 21 15:05:18.506413 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:18.506375 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56c8dc547d-jl8h8"] Apr 21 15:05:25.433911 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:25.433876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" event={"ID":"4ec108d1-5289-41b1-964f-e7a95ba88aa3","Type":"ContainerStarted","Data":"f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335"} Apr 21 15:05:25.434366 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:25.433951 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:25.452148 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:25.452104 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" 
podStartSLOduration=2.047392323 podStartE2EDuration="8.452090251s" podCreationTimestamp="2026-04-21 15:05:17 +0000 UTC" firstStartedPulling="2026-04-21 15:05:18.307174239 +0000 UTC m=+564.442151841" lastFinishedPulling="2026-04-21 15:05:24.711872181 +0000 UTC m=+570.846849769" observedRunningTime="2026-04-21 15:05:25.451001175 +0000 UTC m=+571.585978794" watchObservedRunningTime="2026-04-21 15:05:25.452090251 +0000 UTC m=+571.587067859" Apr 21 15:05:36.439994 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:36.439962 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:37.694034 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.693996 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj"] Apr 21 15:05:37.697164 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.697141 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:37.709398 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.709371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj"] Apr 21 15:05:37.743605 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.743569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1d21af0f-bcbb-42bc-8365-c8708c699049-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" (UID: \"1d21af0f-bcbb-42bc-8365-c8708c699049\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:37.743803 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.743658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trb55\" (UniqueName: \"kubernetes.io/projected/1d21af0f-bcbb-42bc-8365-c8708c699049-kube-api-access-trb55\") pod \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" (UID: \"1d21af0f-bcbb-42bc-8365-c8708c699049\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:37.844597 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.844560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1d21af0f-bcbb-42bc-8365-c8708c699049-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" (UID: \"1d21af0f-bcbb-42bc-8365-c8708c699049\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:37.844755 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.844614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trb55\" (UniqueName: \"kubernetes.io/projected/1d21af0f-bcbb-42bc-8365-c8708c699049-kube-api-access-trb55\") pod \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" (UID: \"1d21af0f-bcbb-42bc-8365-c8708c699049\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:37.844919 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.844898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/1d21af0f-bcbb-42bc-8365-c8708c699049-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" (UID: \"1d21af0f-bcbb-42bc-8365-c8708c699049\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:37.853037 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:37.853010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trb55\" (UniqueName: \"kubernetes.io/projected/1d21af0f-bcbb-42bc-8365-c8708c699049-kube-api-access-trb55\") pod \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" (UID: \"1d21af0f-bcbb-42bc-8365-c8708c699049\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:38.008390 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.008355 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:38.157421 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.157390 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj"] Apr 21 15:05:38.160030 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:05:38.160001 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d21af0f_bcbb_42bc_8365_c8708c699049.slice/crio-cc21e6c7bb97ac150d1acd45decc5c68c6d2f67367de23b9b7b9666ed877e26f WatchSource:0}: Error finding container cc21e6c7bb97ac150d1acd45decc5c68c6d2f67367de23b9b7b9666ed877e26f: Status 404 returned error can't find the container with id cc21e6c7bb97ac150d1acd45decc5c68c6d2f67367de23b9b7b9666ed877e26f Apr 21 15:05:38.238648 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.238618 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88"] Apr 21 15:05:38.238855 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.238829 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" podUID="4ec108d1-5289-41b1-964f-e7a95ba88aa3" containerName="manager" containerID="cri-o://f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335" gracePeriod=2 Apr 21 15:05:38.244964 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.244936 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88"] Apr 21 15:05:38.265597 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.265532 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj"] Apr 21 15:05:38.267696 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.267675 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb"] Apr 21 15:05:38.268072 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.268061 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ec108d1-5289-41b1-964f-e7a95ba88aa3" containerName="manager" Apr 21 15:05:38.268108 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.268074 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec108d1-5289-41b1-964f-e7a95ba88aa3" containerName="manager" Apr 21 15:05:38.268152 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.268143 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="4ec108d1-5289-41b1-964f-e7a95ba88aa3" containerName="manager" Apr 21 15:05:38.270421 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.270397 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:38.277113 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.277084 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj"] Apr 21 15:05:38.282641 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.282621 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb"] Apr 21 15:05:38.290777 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.290752 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q"] Apr 21 15:05:38.291264 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.291222 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d21af0f-bcbb-42bc-8365-c8708c699049" containerName="manager" Apr 21 15:05:38.291264 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.291257 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d21af0f-bcbb-42bc-8365-c8708c699049" containerName="manager" Apr 21 15:05:38.291408 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.291344 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d21af0f-bcbb-42bc-8365-c8708c699049" containerName="manager" Apr 21 15:05:38.293960 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.293943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:38.307228 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.307195 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q"] Apr 21 15:05:38.313620 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.313202 2576 status_manager.go:895] "Failed to get status for pod" podUID="4ec108d1-5289-41b1-964f-e7a95ba88aa3" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-j7j88\" is forbidden: User \"system:node:ip-10-0-130-121.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-121.ec2.internal' and this object" Apr 21 15:05:38.333541 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.333500 2576 status_manager.go:895] "Failed to get status for pod" podUID="4ec108d1-5289-41b1-964f-e7a95ba88aa3" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-j7j88\" is forbidden: User \"system:node:ip-10-0-130-121.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-121.ec2.internal' and this object" Apr 21 15:05:38.349990 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.349962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js9n8\" (UniqueName: \"kubernetes.io/projected/6576a751-7f6f-47f4-a67d-f802e918f1db-kube-api-access-js9n8\") pod 
\"kuadrant-operator-controller-manager-84b657d985-6xf6q\" (UID: \"6576a751-7f6f-47f4-a67d-f802e918f1db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:38.350097 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.349998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6576a751-7f6f-47f4-a67d-f802e918f1db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-6xf6q\" (UID: \"6576a751-7f6f-47f4-a67d-f802e918f1db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:38.350097 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.350041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56zvq\" (UniqueName: \"kubernetes.io/projected/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-kube-api-access-56zvq\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b97tb\" (UID: \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:38.350097 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.350063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b97tb\" (UID: \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:38.450613 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.450579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b97tb\" (UID: \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:38.450763 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.450673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-js9n8\" (UniqueName: \"kubernetes.io/projected/6576a751-7f6f-47f4-a67d-f802e918f1db-kube-api-access-js9n8\") pod \"kuadrant-operator-controller-manager-84b657d985-6xf6q\" (UID: \"6576a751-7f6f-47f4-a67d-f802e918f1db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:38.450763 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.450705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6576a751-7f6f-47f4-a67d-f802e918f1db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-6xf6q\" (UID: \"6576a751-7f6f-47f4-a67d-f802e918f1db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:38.450763 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.450754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56zvq\" (UniqueName: \"kubernetes.io/projected/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-kube-api-access-56zvq\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b97tb\" (UID: \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:38.451086 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.451023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b97tb\" (UID: \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:38.451223 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.451198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6576a751-7f6f-47f4-a67d-f802e918f1db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-6xf6q\" (UID: \"6576a751-7f6f-47f4-a67d-f802e918f1db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:38.460957 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.460935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-js9n8\" (UniqueName: \"kubernetes.io/projected/6576a751-7f6f-47f4-a67d-f802e918f1db-kube-api-access-js9n8\") pod \"kuadrant-operator-controller-manager-84b657d985-6xf6q\" (UID: \"6576a751-7f6f-47f4-a67d-f802e918f1db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:38.460957 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.460952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56zvq\" (UniqueName: \"kubernetes.io/projected/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-kube-api-access-56zvq\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-b97tb\" (UID: \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:38.465683 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.465668 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:38.486020 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.485993 2576 generic.go:358] "Generic (PLEG): container finished" podID="4ec108d1-5289-41b1-964f-e7a95ba88aa3" containerID="f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335" exitCode=0 Apr 21 15:05:38.486101 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.486033 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-j7j88" Apr 21 15:05:38.486101 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.486088 2576 scope.go:117] "RemoveContainer" containerID="f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335" Apr 21 15:05:38.487680 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:05:38.487662 2576 kuberuntime_manager.go:623] "Missing actuated resource record" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" container="manager" Apr 21 15:05:38.495731 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.495701 2576 scope.go:117] "RemoveContainer" containerID="f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335" Apr 21 15:05:38.495994 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:05:38.495976 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335\": container with ID starting with f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335 not found: ID does not exist" containerID="f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335" Apr 21 15:05:38.496046 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.496002 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335"} err="failed to get container status \"f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335\": rpc error: code = NotFound desc = could not find container \"f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335\": container with ID starting with f94a4579857beea7d50087dc868645ccb76abb700822526759ea1afb76fbc335 not found: ID does not exist" Apr 21 15:05:38.551718 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.551618 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zbj7\" (UniqueName: \"kubernetes.io/projected/4ec108d1-5289-41b1-964f-e7a95ba88aa3-kube-api-access-4zbj7\") pod \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\" (UID: \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\") " Apr 21 15:05:38.551893 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.551741 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ec108d1-5289-41b1-964f-e7a95ba88aa3-extensions-socket-volume\") pod \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\" (UID: \"4ec108d1-5289-41b1-964f-e7a95ba88aa3\") " Apr 21 15:05:38.552299 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.552229 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec108d1-5289-41b1-964f-e7a95ba88aa3-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "4ec108d1-5289-41b1-964f-e7a95ba88aa3" (UID: "4ec108d1-5289-41b1-964f-e7a95ba88aa3"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:05:38.554048 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.554023 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec108d1-5289-41b1-964f-e7a95ba88aa3-kube-api-access-4zbj7" (OuterVolumeSpecName: "kube-api-access-4zbj7") pod "4ec108d1-5289-41b1-964f-e7a95ba88aa3" (UID: "4ec108d1-5289-41b1-964f-e7a95ba88aa3"). InnerVolumeSpecName "kube-api-access-4zbj7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:05:38.620034 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.620004 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:38.629902 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.629845 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:38.653200 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.653165 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ec108d1-5289-41b1-964f-e7a95ba88aa3-extensions-socket-volume\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:38.653200 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.653194 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zbj7\" (UniqueName: \"kubernetes.io/projected/4ec108d1-5289-41b1-964f-e7a95ba88aa3-kube-api-access-4zbj7\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:38.767439 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.767402 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb"] Apr 21 15:05:38.770298 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:05:38.770273 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f24ce6_cf2f_400a_b8e1_bdd7b8495218.slice/crio-49c3371a2360107ed6174b5da3793f47e01a095fa68a97c40e1fd556afabaa83 WatchSource:0}: Error finding container 49c3371a2360107ed6174b5da3793f47e01a095fa68a97c40e1fd556afabaa83: Status 404 returned error can't find the container with id 49c3371a2360107ed6174b5da3793f47e01a095fa68a97c40e1fd556afabaa83 Apr 21 15:05:38.792866 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:38.792844 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q"] Apr 21 15:05:38.803485 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:05:38.803450 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6576a751_7f6f_47f4_a67d_f802e918f1db.slice/crio-3940dfa31b6a6dabeb1bcb8f55ff44edbc586ab864bd50396de00b661c7ab2b3 WatchSource:0}: Error finding container 3940dfa31b6a6dabeb1bcb8f55ff44edbc586ab864bd50396de00b661c7ab2b3: Status 404 returned error can't find the container with id 3940dfa31b6a6dabeb1bcb8f55ff44edbc586ab864bd50396de00b661c7ab2b3 Apr 21 15:05:39.493263 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.493196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" event={"ID":"6576a751-7f6f-47f4-a67d-f802e918f1db","Type":"ContainerStarted","Data":"dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e"} Apr 21 15:05:39.493263 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.493233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" event={"ID":"6576a751-7f6f-47f4-a67d-f802e918f1db","Type":"ContainerStarted","Data":"3940dfa31b6a6dabeb1bcb8f55ff44edbc586ab864bd50396de00b661c7ab2b3"} Apr 21 15:05:39.493600 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.493293 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:39.494679 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.494658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" event={"ID":"17f24ce6-cf2f-400a-b8e1-bdd7b8495218","Type":"ContainerStarted","Data":"af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69"} Apr 21 15:05:39.494797 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.494686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" event={"ID":"17f24ce6-cf2f-400a-b8e1-bdd7b8495218","Type":"ContainerStarted","Data":"49c3371a2360107ed6174b5da3793f47e01a095fa68a97c40e1fd556afabaa83"} Apr 21 15:05:39.494797 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.494765 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:39.496979 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.496922 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" podUID="1d21af0f-bcbb-42bc-8365-c8708c699049" containerName="manager" containerID="cri-o://6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353" gracePeriod=2 Apr 21 15:05:39.516135 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.516077 2576 status_manager.go:895] "Failed to get status for pod" podUID="1d21af0f-bcbb-42bc-8365-c8708c699049" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" err="pods \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" is forbidden: User \"system:node:ip-10-0-130-121.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-121.ec2.internal' and this object" Apr 21 15:05:39.519014 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.518974 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" podStartSLOduration=1.518962956 podStartE2EDuration="1.518962956s" podCreationTimestamp="2026-04-21 15:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:05:39.514505225 +0000 UTC m=+585.649482834" watchObservedRunningTime="2026-04-21 15:05:39.518962956 +0000 UTC m=+585.653940638" Apr 21 15:05:39.546772 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.546734 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" podStartSLOduration=1.546713888 podStartE2EDuration="1.546713888s" podCreationTimestamp="2026-04-21 15:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:05:39.542879603 +0000 UTC m=+585.677857212" watchObservedRunningTime="2026-04-21 15:05:39.546713888 +0000 UTC m=+585.681691496" Apr 21 15:05:39.740283 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.740260 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:39.742249 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.742210 2576 status_manager.go:895] "Failed to get status for pod" podUID="1d21af0f-bcbb-42bc-8365-c8708c699049" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" err="pods \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" is forbidden: User \"system:node:ip-10-0-130-121.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-121.ec2.internal' and this object" Apr 21 15:05:39.761737 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.761711 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trb55\" (UniqueName: \"kubernetes.io/projected/1d21af0f-bcbb-42bc-8365-c8708c699049-kube-api-access-trb55\") pod \"1d21af0f-bcbb-42bc-8365-c8708c699049\" (UID: \"1d21af0f-bcbb-42bc-8365-c8708c699049\") " Apr 21 15:05:39.761877 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.761749 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1d21af0f-bcbb-42bc-8365-c8708c699049-extensions-socket-volume\") pod \"1d21af0f-bcbb-42bc-8365-c8708c699049\" (UID: \"1d21af0f-bcbb-42bc-8365-c8708c699049\") " Apr 21 15:05:39.762051 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.762024 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d21af0f-bcbb-42bc-8365-c8708c699049-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "1d21af0f-bcbb-42bc-8365-c8708c699049" (UID: "1d21af0f-bcbb-42bc-8365-c8708c699049"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:05:39.763876 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.763854 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d21af0f-bcbb-42bc-8365-c8708c699049-kube-api-access-trb55" (OuterVolumeSpecName: "kube-api-access-trb55") pod "1d21af0f-bcbb-42bc-8365-c8708c699049" (UID: "1d21af0f-bcbb-42bc-8365-c8708c699049"). InnerVolumeSpecName "kube-api-access-trb55". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:05:39.862399 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.862370 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trb55\" (UniqueName: \"kubernetes.io/projected/1d21af0f-bcbb-42bc-8365-c8708c699049-kube-api-access-trb55\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:39.862399 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:39.862395 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1d21af0f-bcbb-42bc-8365-c8708c699049-extensions-socket-volume\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:40.453180 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:40.453141 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d21af0f-bcbb-42bc-8365-c8708c699049" path="/var/lib/kubelet/pods/1d21af0f-bcbb-42bc-8365-c8708c699049/volumes" Apr 21 15:05:40.453626 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:40.453610 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec108d1-5289-41b1-964f-e7a95ba88aa3" path="/var/lib/kubelet/pods/4ec108d1-5289-41b1-964f-e7a95ba88aa3/volumes" Apr 21 15:05:40.501643 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:40.501605 2576 generic.go:358] "Generic (PLEG): container finished" podID="1d21af0f-bcbb-42bc-8365-c8708c699049" containerID="6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353" exitCode=2 Apr 21 15:05:40.501828 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:40.501650 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" Apr 21 15:05:40.501828 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:40.501695 2576 scope.go:117] "RemoveContainer" containerID="6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353" Apr 21 15:05:40.505728 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:40.505703 2576 status_manager.go:895] "Failed to get status for pod" podUID="1d21af0f-bcbb-42bc-8365-c8708c699049" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-9zqpj" err="pods \"kuadrant-operator-controller-manager-84b657d985-9zqpj\" is forbidden: User \"system:node:ip-10-0-130-121.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-121.ec2.internal' and this object" Apr 21 15:05:40.510315 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:40.510291 2576 scope.go:117] "RemoveContainer" containerID="6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353" Apr 21 15:05:40.510557 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:05:40.510537 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353\": container with ID starting with 6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353 not found: ID does not exist" containerID="6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353" Apr 21 15:05:40.510627 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:40.510566 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353"} err="failed to get container status 
\"6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353\": rpc error: code = NotFound desc = could not find container \"6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353\": container with ID starting with 6aeb579aad541524d737c20685b06ad97eda0891f2f614ff3f43be012ec29353 not found: ID does not exist" Apr 21 15:05:43.529044 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.529009 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56c8dc547d-jl8h8" podUID="1c059704-159b-4a52-90e0-6e3ea29cb80e" containerName="console" containerID="cri-o://ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e" gracePeriod=15 Apr 21 15:05:43.777275 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.777227 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56c8dc547d-jl8h8_1c059704-159b-4a52-90e0-6e3ea29cb80e/console/0.log" Apr 21 15:05:43.777436 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.777313 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:05:43.796607 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.796531 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-oauth-serving-cert\") pod \"1c059704-159b-4a52-90e0-6e3ea29cb80e\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " Apr 21 15:05:43.796721 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.796610 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-config\") pod \"1c059704-159b-4a52-90e0-6e3ea29cb80e\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " Apr 21 15:05:43.796721 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.796633 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-oauth-config\") pod \"1c059704-159b-4a52-90e0-6e3ea29cb80e\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " Apr 21 15:05:43.796721 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.796678 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-trusted-ca-bundle\") pod \"1c059704-159b-4a52-90e0-6e3ea29cb80e\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " Apr 21 15:05:43.797586 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.797078 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-serving-cert\") pod \"1c059704-159b-4a52-90e0-6e3ea29cb80e\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " Apr 21 15:05:43.797586 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.797119 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-service-ca\") pod \"1c059704-159b-4a52-90e0-6e3ea29cb80e\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " Apr 21 15:05:43.797586 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.797145 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-vwdns\" (UniqueName: \"kubernetes.io/projected/1c059704-159b-4a52-90e0-6e3ea29cb80e-kube-api-access-vwdns\") pod \"1c059704-159b-4a52-90e0-6e3ea29cb80e\" (UID: \"1c059704-159b-4a52-90e0-6e3ea29cb80e\") " Apr 21 15:05:43.797586 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.797359 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-config" (OuterVolumeSpecName: "console-config") pod "1c059704-159b-4a52-90e0-6e3ea29cb80e" (UID: "1c059704-159b-4a52-90e0-6e3ea29cb80e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:05:43.797586 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.797527 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:43.797875 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.797683 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1c059704-159b-4a52-90e0-6e3ea29cb80e" (UID: "1c059704-159b-4a52-90e0-6e3ea29cb80e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:05:43.797875 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.797818 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1c059704-159b-4a52-90e0-6e3ea29cb80e" (UID: "1c059704-159b-4a52-90e0-6e3ea29cb80e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:05:43.797987 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.797927 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-service-ca" (OuterVolumeSpecName: "service-ca") pod "1c059704-159b-4a52-90e0-6e3ea29cb80e" (UID: "1c059704-159b-4a52-90e0-6e3ea29cb80e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:05:43.800314 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.800280 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1c059704-159b-4a52-90e0-6e3ea29cb80e" (UID: "1c059704-159b-4a52-90e0-6e3ea29cb80e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:05:43.800526 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.800496 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c059704-159b-4a52-90e0-6e3ea29cb80e-kube-api-access-vwdns" (OuterVolumeSpecName: "kube-api-access-vwdns") pod "1c059704-159b-4a52-90e0-6e3ea29cb80e" (UID: "1c059704-159b-4a52-90e0-6e3ea29cb80e"). InnerVolumeSpecName "kube-api-access-vwdns". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:05:43.800616 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.800592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1c059704-159b-4a52-90e0-6e3ea29cb80e" (UID: "1c059704-159b-4a52-90e0-6e3ea29cb80e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:05:43.898902 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.898861 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwdns\" (UniqueName: \"kubernetes.io/projected/1c059704-159b-4a52-90e0-6e3ea29cb80e-kube-api-access-vwdns\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:43.898902 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.898895 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-oauth-serving-cert\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:43.898902 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.898905 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-oauth-config\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:43.899138 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.898914 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-trusted-ca-bundle\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:43.899138 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.898925 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c059704-159b-4a52-90e0-6e3ea29cb80e-console-serving-cert\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:43.899138 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:43.898934 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c059704-159b-4a52-90e0-6e3ea29cb80e-service-ca\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:44.519932 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.519905 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56c8dc547d-jl8h8_1c059704-159b-4a52-90e0-6e3ea29cb80e/console/0.log" Apr 21 15:05:44.520105 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.519943 2576 generic.go:358] "Generic (PLEG): container finished" podID="1c059704-159b-4a52-90e0-6e3ea29cb80e" containerID="ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e" exitCode=2 Apr 21 15:05:44.520105 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.519975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c8dc547d-jl8h8" event={"ID":"1c059704-159b-4a52-90e0-6e3ea29cb80e","Type":"ContainerDied","Data":"ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e"} Apr 21 15:05:44.520105 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.519996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c8dc547d-jl8h8" 
event={"ID":"1c059704-159b-4a52-90e0-6e3ea29cb80e","Type":"ContainerDied","Data":"36c7034230e460e47679adac9719e1a6619c0f96a84d7e7465ce1656b46e6ab9"} Apr 21 15:05:44.520105 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.520011 2576 scope.go:117] "RemoveContainer" containerID="ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e" Apr 21 15:05:44.520105 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.520021 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c8dc547d-jl8h8" Apr 21 15:05:44.528583 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.528565 2576 scope.go:117] "RemoveContainer" containerID="ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e" Apr 21 15:05:44.528806 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:05:44.528788 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e\": container with ID starting with ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e not found: ID does not exist" containerID="ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e" Apr 21 15:05:44.528859 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.528813 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e"} err="failed to get container status \"ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e\": rpc error: code = NotFound desc = could not find container \"ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e\": container with ID starting with ab189d63021a1c13b4e269ca1fcdebd498bcbe54006b2125a65592dad7cce81e not found: ID does not exist" Apr 21 15:05:44.537505 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.537484 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56c8dc547d-jl8h8"] Apr 21 15:05:44.542251 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:44.542215 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56c8dc547d-jl8h8"] Apr 21 15:05:46.451270 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:46.451221 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c059704-159b-4a52-90e0-6e3ea29cb80e" path="/var/lib/kubelet/pods/1c059704-159b-4a52-90e0-6e3ea29cb80e/volumes" Apr 21 15:05:50.504305 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.504276 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:05:50.504739 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.504680 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:50.610016 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.609984 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb"] Apr 21 15:05:50.610223 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.610199 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" podUID="17f24ce6-cf2f-400a-b8e1-bdd7b8495218" containerName="manager" 
containerID="cri-o://af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69" gracePeriod=10 Apr 21 15:05:50.856287 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.856266 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:50.961604 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.961572 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56zvq\" (UniqueName: \"kubernetes.io/projected/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-kube-api-access-56zvq\") pod \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\" (UID: \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\") " Apr 21 15:05:50.961769 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.961677 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-extensions-socket-volume\") pod \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\" (UID: \"17f24ce6-cf2f-400a-b8e1-bdd7b8495218\") " Apr 21 15:05:50.962036 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.962009 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "17f24ce6-cf2f-400a-b8e1-bdd7b8495218" (UID: "17f24ce6-cf2f-400a-b8e1-bdd7b8495218"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:05:50.963743 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.963722 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-kube-api-access-56zvq" (OuterVolumeSpecName: "kube-api-access-56zvq") pod "17f24ce6-cf2f-400a-b8e1-bdd7b8495218" (UID: "17f24ce6-cf2f-400a-b8e1-bdd7b8495218"). InnerVolumeSpecName "kube-api-access-56zvq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:05:50.999796 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:50.999766 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5"] Apr 21 15:05:51.000163 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.000150 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c059704-159b-4a52-90e0-6e3ea29cb80e" containerName="console" Apr 21 15:05:51.000207 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.000166 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c059704-159b-4a52-90e0-6e3ea29cb80e" containerName="console" Apr 21 15:05:51.000207 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.000183 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17f24ce6-cf2f-400a-b8e1-bdd7b8495218" containerName="manager" Apr 21 15:05:51.000207 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.000189 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f24ce6-cf2f-400a-b8e1-bdd7b8495218" containerName="manager" Apr 21 15:05:51.000329 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.000264 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c059704-159b-4a52-90e0-6e3ea29cb80e" containerName="console" Apr 21 15:05:51.000329 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.000276 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="17f24ce6-cf2f-400a-b8e1-bdd7b8495218" containerName="manager" Apr 21 15:05:51.002469 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.002452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.020576 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.020549 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5"] Apr 21 15:05:51.062900 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.062804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51a8e561-a655-48b2-8ed6-57efb52a742b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-dkhw5\" (UID: \"51a8e561-a655-48b2-8ed6-57efb52a742b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.063062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.062922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxwfh\" (UniqueName: \"kubernetes.io/projected/51a8e561-a655-48b2-8ed6-57efb52a742b-kube-api-access-vxwfh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-dkhw5\" (UID: \"51a8e561-a655-48b2-8ed6-57efb52a742b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.063062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.063023 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-56zvq\" (UniqueName: \"kubernetes.io/projected/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-kube-api-access-56zvq\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:51.063062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.063034 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/17f24ce6-cf2f-400a-b8e1-bdd7b8495218-extensions-socket-volume\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:05:51.163924 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.163886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51a8e561-a655-48b2-8ed6-57efb52a742b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-dkhw5\" (UID: \"51a8e561-a655-48b2-8ed6-57efb52a742b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.164078 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.163962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxwfh\" (UniqueName: \"kubernetes.io/projected/51a8e561-a655-48b2-8ed6-57efb52a742b-kube-api-access-vxwfh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-dkhw5\" (UID: \"51a8e561-a655-48b2-8ed6-57efb52a742b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.164304 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.164283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51a8e561-a655-48b2-8ed6-57efb52a742b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-dkhw5\" (UID: \"51a8e561-a655-48b2-8ed6-57efb52a742b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.173457 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.173432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxwfh\" (UniqueName: \"kubernetes.io/projected/51a8e561-a655-48b2-8ed6-57efb52a742b-kube-api-access-vxwfh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-dkhw5\" (UID: \"51a8e561-a655-48b2-8ed6-57efb52a742b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.313126 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.313044 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.463202 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.463143 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5"] Apr 21 15:05:51.465609 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:05:51.465584 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a8e561_a655_48b2_8ed6_57efb52a742b.slice/crio-4e9a0fe8c50b42759f866dd20a04a886c1e8b352c96a711ec40901183976236b WatchSource:0}: Error finding container 4e9a0fe8c50b42759f866dd20a04a886c1e8b352c96a711ec40901183976236b: Status 404 returned error can't find the container with id 4e9a0fe8c50b42759f866dd20a04a886c1e8b352c96a711ec40901183976236b Apr 21 15:05:51.550969 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.550933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" event={"ID":"51a8e561-a655-48b2-8ed6-57efb52a742b","Type":"ContainerStarted","Data":"5cfc2a871c3b1323282cbb3926c3d28566767baa2aa09c61943a76b42dd4188d"} Apr 21 15:05:51.551394 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.550981 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:05:51.551394 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.550998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" event={"ID":"51a8e561-a655-48b2-8ed6-57efb52a742b","Type":"ContainerStarted","Data":"4e9a0fe8c50b42759f866dd20a04a886c1e8b352c96a711ec40901183976236b"} Apr 21 15:05:51.552086 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.552061 2576 generic.go:358] "Generic (PLEG): container finished" podID="17f24ce6-cf2f-400a-b8e1-bdd7b8495218" containerID="af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69" exitCode=0 Apr 21 15:05:51.552135 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.552123 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" Apr 21 15:05:51.552216 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.552124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" event={"ID":"17f24ce6-cf2f-400a-b8e1-bdd7b8495218","Type":"ContainerDied","Data":"af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69"} Apr 21 15:05:51.552301 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.552230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb" event={"ID":"17f24ce6-cf2f-400a-b8e1-bdd7b8495218","Type":"ContainerDied","Data":"49c3371a2360107ed6174b5da3793f47e01a095fa68a97c40e1fd556afabaa83"} Apr 21 15:05:51.552301 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.552269 2576 scope.go:117] "RemoveContainer" containerID="af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69" Apr 21 15:05:51.561062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.561045 2576 scope.go:117] "RemoveContainer" containerID="af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69" Apr 21 15:05:51.561352 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:05:51.561334 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69\": container with ID starting with af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69 not found: ID does not exist" containerID="af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69" Apr 21 15:05:51.561406 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.561359 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69"} err="failed to get container status \"af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69\": rpc error: code = NotFound desc = could not find container \"af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69\": container with ID starting with af7a84235e8c47e3fc1ec62b8b7e76a4a34f56ea73ca8d425bda92d777ac0c69 not found: ID does not exist" Apr 21 15:05:51.577065 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.576985 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" podStartSLOduration=1.576969974 podStartE2EDuration="1.576969974s" podCreationTimestamp="2026-04-21 15:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:05:51.572901559 +0000 UTC m=+597.707879168" watchObservedRunningTime="2026-04-21 15:05:51.576969974 +0000 UTC m=+597.711947583" Apr 21 15:05:51.593637 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.593610 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb"] Apr 21 15:05:51.603820 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:51.603790 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-b97tb"] Apr 21 15:05:52.451852 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:05:52.451811 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="17f24ce6-cf2f-400a-b8e1-bdd7b8495218" path="/var/lib/kubelet/pods/17f24ce6-cf2f-400a-b8e1-bdd7b8495218/volumes" Apr 21 15:06:02.559583 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:02.559549 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:06:02.610139 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:02.610101 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q"] Apr 21 15:06:02.610407 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:02.610386 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" podUID="6576a751-7f6f-47f4-a67d-f802e918f1db" containerName="manager" containerID="cri-o://dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e" gracePeriod=10 Apr 21 15:06:02.851564 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:02.851538 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:06:02.975150 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:02.975110 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6576a751-7f6f-47f4-a67d-f802e918f1db-extensions-socket-volume\") pod \"6576a751-7f6f-47f4-a67d-f802e918f1db\" (UID: \"6576a751-7f6f-47f4-a67d-f802e918f1db\") " Apr 21 15:06:02.975360 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:02.975178 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js9n8\" (UniqueName: \"kubernetes.io/projected/6576a751-7f6f-47f4-a67d-f802e918f1db-kube-api-access-js9n8\") pod \"6576a751-7f6f-47f4-a67d-f802e918f1db\" (UID: \"6576a751-7f6f-47f4-a67d-f802e918f1db\") " Apr 21 15:06:02.975602 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:02.975562 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6576a751-7f6f-47f4-a67d-f802e918f1db-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6576a751-7f6f-47f4-a67d-f802e918f1db" (UID: "6576a751-7f6f-47f4-a67d-f802e918f1db"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:06:02.977454 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:02.977426 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6576a751-7f6f-47f4-a67d-f802e918f1db-kube-api-access-js9n8" (OuterVolumeSpecName: "kube-api-access-js9n8") pod "6576a751-7f6f-47f4-a67d-f802e918f1db" (UID: "6576a751-7f6f-47f4-a67d-f802e918f1db"). InnerVolumeSpecName "kube-api-access-js9n8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:06:03.076458 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.076429 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-js9n8\" (UniqueName: \"kubernetes.io/projected/6576a751-7f6f-47f4-a67d-f802e918f1db-kube-api-access-js9n8\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:06:03.076458 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.076454 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6576a751-7f6f-47f4-a67d-f802e918f1db-extensions-socket-volume\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:06:03.597572 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.597532 2576 generic.go:358] "Generic (PLEG): container finished" podID="6576a751-7f6f-47f4-a67d-f802e918f1db" containerID="dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e" exitCode=0 Apr 21 15:06:03.597994 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.597593 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" Apr 21 15:06:03.597994 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.597601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" event={"ID":"6576a751-7f6f-47f4-a67d-f802e918f1db","Type":"ContainerDied","Data":"dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e"} Apr 21 15:06:03.597994 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.597630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q" event={"ID":"6576a751-7f6f-47f4-a67d-f802e918f1db","Type":"ContainerDied","Data":"3940dfa31b6a6dabeb1bcb8f55ff44edbc586ab864bd50396de00b661c7ab2b3"} Apr 21 15:06:03.597994 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.597644 2576 scope.go:117] "RemoveContainer" containerID="dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e" Apr 21 15:06:03.606506 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.606486 2576 scope.go:117] "RemoveContainer" containerID="dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e" Apr 21 15:06:03.606787 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:06:03.606767 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e\": container with ID starting with dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e not found: ID does not exist" containerID="dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e" Apr 21 15:06:03.606833 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.606797 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e"} err="failed to get container status \"dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e\": rpc error: code = NotFound desc = could not find container \"dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e\": container with ID starting with dcc21ddb75e9d194ca774d739fab86c9692e1caa18fc3face1fe63645e4e893e not found: ID does not exist" Apr 21 15:06:03.618787 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.618759 2576 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q"] Apr 21 15:06:03.625187 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:03.625159 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-6xf6q"] Apr 21 15:06:04.450968 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:04.450936 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6576a751-7f6f-47f4-a67d-f802e918f1db" path="/var/lib/kubelet/pods/6576a751-7f6f-47f4-a67d-f802e918f1db/volumes" Apr 21 15:06:06.921368 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:06.921335 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7"] Apr 21 15:06:06.921840 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:06.921824 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6576a751-7f6f-47f4-a67d-f802e918f1db" containerName="manager" Apr 21 15:06:06.921894 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:06.921842 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6576a751-7f6f-47f4-a67d-f802e918f1db" containerName="manager" Apr 21 15:06:06.921929 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:06.921917 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6576a751-7f6f-47f4-a67d-f802e918f1db" containerName="manager" Apr 21 15:06:06.925811 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:06.925779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:06.932952 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:06.932927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-qbk6d\"" Apr 21 15:06:06.952931 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:06.952900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7"] Apr 21 15:06:07.014545 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.014726 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.014726 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.014726 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hlqq\" (UniqueName: \"kubernetes.io/projected/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-kube-api-access-2hlqq\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.014726 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.014726 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.014934 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.014934 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.014934 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.014895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.115458 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.115653 ip-10-0-130-121 kubenswrapper[2576]: 
I0421 15:06:07.115487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.115653 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.115653 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.115912 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hlqq\" (UniqueName: \"kubernetes.io/projected/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-kube-api-access-2hlqq\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istiod-ca-cert\") pod 
\"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.116026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.115964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.116056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116509 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.116327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.116566 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.116545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.118287 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.118262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.118402 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.118384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.126265 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.126219 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.126604 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.126584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hlqq\" (UniqueName: \"kubernetes.io/projected/bb55fb0d-f2b6-48fd-8ecc-616afdccee2f-kube-api-access-2hlqq\") pod \"maas-default-gateway-openshift-default-845c6b4b48-489p7\" (UID: \"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.239356 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.239234 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:07.389867 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.389800 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7"] Apr 21 15:06:07.392188 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:06:07.392156 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb55fb0d_f2b6_48fd_8ecc_616afdccee2f.slice/crio-55ce3318e48eb5e5260b32501cf5ad541206613871b74dc51d12e57f59389310 WatchSource:0}: Error finding container 55ce3318e48eb5e5260b32501cf5ad541206613871b74dc51d12e57f59389310: Status 404 returned error can't find the container with id 55ce3318e48eb5e5260b32501cf5ad541206613871b74dc51d12e57f59389310 Apr 21 15:06:07.615355 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:07.615311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" event={"ID":"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f","Type":"ContainerStarted","Data":"55ce3318e48eb5e5260b32501cf5ad541206613871b74dc51d12e57f59389310"} Apr 21 15:06:10.511369 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:10.511332 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 21 15:06:10.511657 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:10.511402 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 21 15:06:10.511657 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:10.511438 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 21 15:06:10.629458 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:10.629422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" event={"ID":"bb55fb0d-f2b6-48fd-8ecc-616afdccee2f","Type":"ContainerStarted","Data":"55a5ec52d6e3ff206527405f1e3c8a87bee6a1d52294f104eb0ea115131b6f32"} Apr 21 15:06:10.654422 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:10.654352 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" podStartSLOduration=1.537454274 podStartE2EDuration="4.654335695s" podCreationTimestamp="2026-04-21 15:06:06 +0000 UTC" firstStartedPulling="2026-04-21 15:06:07.3941704 +0000 UTC m=+613.529148001" lastFinishedPulling="2026-04-21 15:06:10.511051835 +0000 UTC m=+616.646029422" observedRunningTime="2026-04-21 15:06:10.650928503 +0000 UTC m=+616.785906113" watchObservedRunningTime="2026-04-21 15:06:10.654335695 +0000 UTC m=+616.789313303" Apr 21 15:06:11.240396 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:11.240359 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:11.244914 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:11.244889 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:11.633891 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:11.633863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:11.635057 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:11.635036 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-489p7" Apr 21 15:06:21.956318 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:21.956197 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:06:21.959227 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:21.959207 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:06:21.962212 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:21.962190 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cdlh8\"" Apr 21 15:06:21.962767 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:21.962750 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 15:06:21.970336 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:21.970317 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:06:22.007383 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.007350 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:06:22.055047 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.055012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2152ab38-2fb4-4ff0-b163-a898eb6a3258-config-file\") pod \"limitador-limitador-78c99df468-pl9jd\" (UID: \"2152ab38-2fb4-4ff0-b163-a898eb6a3258\") " pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:06:22.055212 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.055102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk98c\" (UniqueName: \"kubernetes.io/projected/2152ab38-2fb4-4ff0-b163-a898eb6a3258-kube-api-access-nk98c\") pod \"limitador-limitador-78c99df468-pl9jd\" (UID: \"2152ab38-2fb4-4ff0-b163-a898eb6a3258\") " pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:06:22.155629 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.155594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2152ab38-2fb4-4ff0-b163-a898eb6a3258-config-file\") pod \"limitador-limitador-78c99df468-pl9jd\" (UID: \"2152ab38-2fb4-4ff0-b163-a898eb6a3258\") " pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:06:22.155788 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.155657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk98c\" (UniqueName: \"kubernetes.io/projected/2152ab38-2fb4-4ff0-b163-a898eb6a3258-kube-api-access-nk98c\") pod \"limitador-limitador-78c99df468-pl9jd\" (UID: \"2152ab38-2fb4-4ff0-b163-a898eb6a3258\") " pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:06:22.156460 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.156426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2152ab38-2fb4-4ff0-b163-a898eb6a3258-config-file\") pod \"limitador-limitador-78c99df468-pl9jd\" (UID: \"2152ab38-2fb4-4ff0-b163-a898eb6a3258\") " pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:06:22.168805 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.168777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk98c\" (UniqueName: \"kubernetes.io/projected/2152ab38-2fb4-4ff0-b163-a898eb6a3258-kube-api-access-nk98c\") pod \"limitador-limitador-78c99df468-pl9jd\" (UID: \"2152ab38-2fb4-4ff0-b163-a898eb6a3258\") " pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 
15:06:22.269855 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.269820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:06:22.401713 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.401688 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:06:22.404658 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:06:22.404625 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2152ab38_2fb4_4ff0_b163_a898eb6a3258.slice/crio-1c316bbad5242c7679aa5ba9c2fe3f457cb60f09cd397772162bce5c3422a6f7 WatchSource:0}: Error finding container 1c316bbad5242c7679aa5ba9c2fe3f457cb60f09cd397772162bce5c3422a6f7: Status 404 returned error can't find the container with id 1c316bbad5242c7679aa5ba9c2fe3f457cb60f09cd397772162bce5c3422a6f7 Apr 21 15:06:22.678763 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:22.678665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" event={"ID":"2152ab38-2fb4-4ff0-b163-a898eb6a3258","Type":"ContainerStarted","Data":"1c316bbad5242c7679aa5ba9c2fe3f457cb60f09cd397772162bce5c3422a6f7"} Apr 21 15:06:25.693061 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:25.693028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" event={"ID":"2152ab38-2fb4-4ff0-b163-a898eb6a3258","Type":"ContainerStarted","Data":"b5e06c518a9f0e5f0c6045eea37a251ee97b2d3ef6723b20020c5a01f28af90b"} Apr 21 15:06:25.693489 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:25.693150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:06:25.711961 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:25.711915 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" podStartSLOduration=2.208705184 podStartE2EDuration="4.711901269s" podCreationTimestamp="2026-04-21 15:06:21 +0000 UTC" firstStartedPulling="2026-04-21 15:06:22.406413901 +0000 UTC m=+628.541391488" lastFinishedPulling="2026-04-21 15:06:24.909609986 +0000 UTC m=+631.044587573" observedRunningTime="2026-04-21 15:06:25.709560825 +0000 UTC m=+631.844538434" watchObservedRunningTime="2026-04-21 15:06:25.711901269 +0000 UTC m=+631.846878879" Apr 21 15:06:36.698842 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:06:36.698808 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-pl9jd" Apr 21 15:07:55.103606 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.103516 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g2gkj"] Apr 21 15:07:55.106208 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.106189 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:07:55.109373 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.109351 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-jm2k7\"" Apr 21 15:07:55.119019 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.118990 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g2gkj"] Apr 21 15:07:55.202608 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.202575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brsqd\" (UniqueName: \"kubernetes.io/projected/205e368b-ebfa-453c-b9ac-1a90a2447afb-kube-api-access-brsqd\") pod \"maas-controller-6d4c8f55f9-g2gkj\" (UID: \"205e368b-ebfa-453c-b9ac-1a90a2447afb\") " pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:07:55.289880 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.289836 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-557546c96d-fkb5b"] Apr 21 15:07:55.292407 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.292383 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-557546c96d-fkb5b" Apr 21 15:07:55.303465 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.303433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgr2c\" (UniqueName: \"kubernetes.io/projected/377c2b83-8460-4096-9778-879d61d1aa98-kube-api-access-mgr2c\") pod \"maas-controller-557546c96d-fkb5b\" (UID: \"377c2b83-8460-4096-9778-879d61d1aa98\") " pod="opendatahub/maas-controller-557546c96d-fkb5b" Apr 21 15:07:55.303681 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.303664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brsqd\" (UniqueName: \"kubernetes.io/projected/205e368b-ebfa-453c-b9ac-1a90a2447afb-kube-api-access-brsqd\") pod \"maas-controller-6d4c8f55f9-g2gkj\" (UID: \"205e368b-ebfa-453c-b9ac-1a90a2447afb\") " pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:07:55.306116 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.306091 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-557546c96d-fkb5b"] Apr 21 15:07:55.321287 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.321261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brsqd\" (UniqueName: \"kubernetes.io/projected/205e368b-ebfa-453c-b9ac-1a90a2447afb-kube-api-access-brsqd\") pod \"maas-controller-6d4c8f55f9-g2gkj\" (UID: \"205e368b-ebfa-453c-b9ac-1a90a2447afb\") " pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:07:55.405082 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.404992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgr2c\" (UniqueName: \"kubernetes.io/projected/377c2b83-8460-4096-9778-879d61d1aa98-kube-api-access-mgr2c\") pod \"maas-controller-557546c96d-fkb5b\" (UID: \"377c2b83-8460-4096-9778-879d61d1aa98\") " pod="opendatahub/maas-controller-557546c96d-fkb5b" Apr 21 15:07:55.418422 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.418388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgr2c\" (UniqueName: \"kubernetes.io/projected/377c2b83-8460-4096-9778-879d61d1aa98-kube-api-access-mgr2c\") pod 
\"maas-controller-557546c96d-fkb5b\" (UID: \"377c2b83-8460-4096-9778-879d61d1aa98\") " pod="opendatahub/maas-controller-557546c96d-fkb5b" Apr 21 15:07:55.418568 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.418501 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:07:55.490406 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.490377 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-557546c96d-fkb5b"] Apr 21 15:07:55.490701 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.490687 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-557546c96d-fkb5b" Apr 21 15:07:55.573185 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.573156 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g2gkj"] Apr 21 15:07:55.575166 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:07:55.575140 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod205e368b_ebfa_453c_b9ac_1a90a2447afb.slice/crio-b6736a3cd92f09d67f3b2ae3ba1b9968f5318471dc0e21d6d90346270c4c5174 WatchSource:0}: Error finding container b6736a3cd92f09d67f3b2ae3ba1b9968f5318471dc0e21d6d90346270c4c5174: Status 404 returned error can't find the container with id b6736a3cd92f09d67f3b2ae3ba1b9968f5318471dc0e21d6d90346270c4c5174 Apr 21 15:07:55.576525 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.576506 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:07:55.634394 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:55.634369 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-557546c96d-fkb5b"] Apr 21 15:07:55.637093 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:07:55.637065 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod377c2b83_8460_4096_9778_879d61d1aa98.slice/crio-2973776ca168ba934fcc2b84e43b1c5afd79dd233e04bb3df662b0dcdb8dcbce WatchSource:0}: Error finding container 2973776ca168ba934fcc2b84e43b1c5afd79dd233e04bb3df662b0dcdb8dcbce: Status 404 returned error can't find the container with id 2973776ca168ba934fcc2b84e43b1c5afd79dd233e04bb3df662b0dcdb8dcbce Apr 21 15:07:56.070551 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:56.070495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-557546c96d-fkb5b" event={"ID":"377c2b83-8460-4096-9778-879d61d1aa98","Type":"ContainerStarted","Data":"2973776ca168ba934fcc2b84e43b1c5afd79dd233e04bb3df662b0dcdb8dcbce"} Apr 21 15:07:56.072478 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:56.072428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" event={"ID":"205e368b-ebfa-453c-b9ac-1a90a2447afb","Type":"ContainerStarted","Data":"b6736a3cd92f09d67f3b2ae3ba1b9968f5318471dc0e21d6d90346270c4c5174"} Apr 21 15:07:59.087153 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.087118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" event={"ID":"205e368b-ebfa-453c-b9ac-1a90a2447afb","Type":"ContainerStarted","Data":"1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2"} Apr 21 15:07:59.087611 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.087205 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:07:59.088492 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.088469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-557546c96d-fkb5b" event={"ID":"377c2b83-8460-4096-9778-879d61d1aa98","Type":"ContainerStarted","Data":"24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7"} Apr 21 15:07:59.088592 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.088566 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-557546c96d-fkb5b" podUID="377c2b83-8460-4096-9778-879d61d1aa98" containerName="manager" containerID="cri-o://24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7" gracePeriod=10 Apr 21 15:07:59.088648 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.088614 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-557546c96d-fkb5b" Apr 21 15:07:59.116594 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.116550 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" podStartSLOduration=1.036630069 podStartE2EDuration="4.116536717s" podCreationTimestamp="2026-04-21 15:07:55 +0000 UTC" firstStartedPulling="2026-04-21 15:07:55.576624829 +0000 UTC m=+721.711602416" lastFinishedPulling="2026-04-21 15:07:58.656531462 +0000 UTC m=+724.791509064" observedRunningTime="2026-04-21 15:07:59.115655871 +0000 UTC m=+725.250633481" watchObservedRunningTime="2026-04-21 15:07:59.116536717 +0000 UTC m=+725.251514325" Apr 21 15:07:59.171335 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.171278 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-557546c96d-fkb5b" podStartSLOduration=1.151135725 podStartE2EDuration="4.171262492s" podCreationTimestamp="2026-04-21 15:07:55 +0000 UTC" firstStartedPulling="2026-04-21 15:07:55.638453082 +0000 UTC m=+721.773430669" lastFinishedPulling="2026-04-21 15:07:58.658579836 +0000 UTC m=+724.793557436" observedRunningTime="2026-04-21 15:07:59.169362385 +0000 UTC m=+725.304339994" watchObservedRunningTime="2026-04-21 15:07:59.171262492 +0000 UTC m=+725.306240100" Apr 21 15:07:59.331583 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.331561 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-557546c96d-fkb5b" Apr 21 15:07:59.449011 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.448908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgr2c\" (UniqueName: \"kubernetes.io/projected/377c2b83-8460-4096-9778-879d61d1aa98-kube-api-access-mgr2c\") pod \"377c2b83-8460-4096-9778-879d61d1aa98\" (UID: \"377c2b83-8460-4096-9778-879d61d1aa98\") " Apr 21 15:07:59.451274 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.451249 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377c2b83-8460-4096-9778-879d61d1aa98-kube-api-access-mgr2c" (OuterVolumeSpecName: "kube-api-access-mgr2c") pod "377c2b83-8460-4096-9778-879d61d1aa98" (UID: "377c2b83-8460-4096-9778-879d61d1aa98"). InnerVolumeSpecName "kube-api-access-mgr2c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:07:59.549756 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:07:59.549718 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgr2c\" (UniqueName: \"kubernetes.io/projected/377c2b83-8460-4096-9778-879d61d1aa98-kube-api-access-mgr2c\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:08:00.095776 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.095739 2576 generic.go:358] "Generic (PLEG): container finished" podID="377c2b83-8460-4096-9778-879d61d1aa98" containerID="24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7" exitCode=0 Apr 21 15:08:00.096233 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.095800 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-557546c96d-fkb5b" Apr 21 15:08:00.096233 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.095830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-557546c96d-fkb5b" event={"ID":"377c2b83-8460-4096-9778-879d61d1aa98","Type":"ContainerDied","Data":"24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7"} Apr 21 15:08:00.096233 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.095874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-557546c96d-fkb5b" event={"ID":"377c2b83-8460-4096-9778-879d61d1aa98","Type":"ContainerDied","Data":"2973776ca168ba934fcc2b84e43b1c5afd79dd233e04bb3df662b0dcdb8dcbce"} Apr 21 15:08:00.096233 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.095897 2576 scope.go:117] "RemoveContainer" containerID="24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7" Apr 21 15:08:00.109988 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.109969 2576 scope.go:117] "RemoveContainer" containerID="24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7" Apr 21 15:08:00.110298 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:08:00.110279 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7\": container with ID starting with 24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7 not found: ID does not exist" containerID="24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7" Apr 21 15:08:00.110343 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.110308 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7"} err="failed to get container status \"24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7\": rpc error: code = NotFound desc = could not find container \"24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7\": container with ID starting with 24d612fe6f96c387206b245a2433924da871dea40a761f58aad272c9d9bc54f7 not found: ID does not exist" Apr 21 15:08:00.129675 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.129646 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-557546c96d-fkb5b"] Apr 21 15:08:00.137011 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.136989 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-557546c96d-fkb5b"] Apr 21 15:08:00.452726 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.452643 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="377c2b83-8460-4096-9778-879d61d1aa98" path="/var/lib/kubelet/pods/377c2b83-8460-4096-9778-879d61d1aa98/volumes" Apr 21 15:08:00.495121 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.495080 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-f4f6d6d76-gvtk6"] Apr 21 15:08:00.495747 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.495731 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="377c2b83-8460-4096-9778-879d61d1aa98" containerName="manager" Apr 21 15:08:00.495791 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.495751 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="377c2b83-8460-4096-9778-879d61d1aa98" containerName="manager" Apr 21 15:08:00.495863 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.495853 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="377c2b83-8460-4096-9778-879d61d1aa98" containerName="manager" Apr 21 15:08:00.499516 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.499495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:00.501851 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.501829 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 15:08:00.502431 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.502414 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 15:08:00.502857 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.502833 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-xx77w\"" Apr 21 15:08:00.523930 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.523907 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-f4f6d6d76-gvtk6"] Apr 21 15:08:00.559375 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.559345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wm2\" (UniqueName: \"kubernetes.io/projected/655fd2c5-bba3-477c-b11b-1515bc56c276-kube-api-access-72wm2\") pod \"maas-api-f4f6d6d76-gvtk6\" (UID: \"655fd2c5-bba3-477c-b11b-1515bc56c276\") " pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:00.559550 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.559427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/655fd2c5-bba3-477c-b11b-1515bc56c276-maas-api-tls\") pod \"maas-api-f4f6d6d76-gvtk6\" (UID: \"655fd2c5-bba3-477c-b11b-1515bc56c276\") " pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:00.660614 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.660584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/655fd2c5-bba3-477c-b11b-1515bc56c276-maas-api-tls\") pod \"maas-api-f4f6d6d76-gvtk6\" (UID: \"655fd2c5-bba3-477c-b11b-1515bc56c276\") " pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:00.660779 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.660684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72wm2\" (UniqueName: \"kubernetes.io/projected/655fd2c5-bba3-477c-b11b-1515bc56c276-kube-api-access-72wm2\") pod \"maas-api-f4f6d6d76-gvtk6\" (UID: 
\"655fd2c5-bba3-477c-b11b-1515bc56c276\") " pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:00.663228 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.663203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/655fd2c5-bba3-477c-b11b-1515bc56c276-maas-api-tls\") pod \"maas-api-f4f6d6d76-gvtk6\" (UID: \"655fd2c5-bba3-477c-b11b-1515bc56c276\") " pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:00.674430 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.674404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wm2\" (UniqueName: \"kubernetes.io/projected/655fd2c5-bba3-477c-b11b-1515bc56c276-kube-api-access-72wm2\") pod \"maas-api-f4f6d6d76-gvtk6\" (UID: \"655fd2c5-bba3-477c-b11b-1515bc56c276\") " pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:00.812216 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.812178 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:00.957591 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:00.957526 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-f4f6d6d76-gvtk6"] Apr 21 15:08:00.959991 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:08:00.959962 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod655fd2c5_bba3_477c_b11b_1515bc56c276.slice/crio-ca6953c57b208540ad4da0beeca6c1ef2c3ad2c30313ad15176dc36d9c1d3246 WatchSource:0}: Error finding container ca6953c57b208540ad4da0beeca6c1ef2c3ad2c30313ad15176dc36d9c1d3246: Status 404 returned error can't find the container with id ca6953c57b208540ad4da0beeca6c1ef2c3ad2c30313ad15176dc36d9c1d3246 Apr 21 15:08:01.101554 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:01.101465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" event={"ID":"655fd2c5-bba3-477c-b11b-1515bc56c276","Type":"ContainerStarted","Data":"ca6953c57b208540ad4da0beeca6c1ef2c3ad2c30313ad15176dc36d9c1d3246"} Apr 21 15:08:01.419294 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:01.419164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:08:03.112983 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:03.112933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" event={"ID":"655fd2c5-bba3-477c-b11b-1515bc56c276","Type":"ContainerStarted","Data":"e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52"} Apr 21 15:08:03.113802 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:03.113773 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:03.137707 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:03.137655 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" podStartSLOduration=1.413005405 podStartE2EDuration="3.13763974s" podCreationTimestamp="2026-04-21 15:08:00 +0000 UTC" firstStartedPulling="2026-04-21 15:08:00.961321399 +0000 UTC m=+727.096298986" lastFinishedPulling="2026-04-21 15:08:02.68595573 +0000 UTC m=+728.820933321" observedRunningTime="2026-04-21 15:08:03.137285348 +0000 UTC m=+729.272262958" watchObservedRunningTime="2026-04-21 15:08:03.13763974 +0000 UTC m=+729.272617348" Apr 21 15:08:10.102333 
ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.102297 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:08:10.126948 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.126918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:10.183843 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.183809 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g2gkj"] Apr 21 15:08:10.184071 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.184040 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" podUID="205e368b-ebfa-453c-b9ac-1a90a2447afb" containerName="manager" containerID="cri-o://1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2" gracePeriod=10 Apr 21 15:08:10.432355 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.432333 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:08:10.504270 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.504213 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-86d7574f74-nh66m"] Apr 21 15:08:10.504641 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.504627 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="205e368b-ebfa-453c-b9ac-1a90a2447afb" containerName="manager" Apr 21 15:08:10.504641 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.504642 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="205e368b-ebfa-453c-b9ac-1a90a2447afb" containerName="manager" Apr 21 15:08:10.504747 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.504718 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="205e368b-ebfa-453c-b9ac-1a90a2447afb" containerName="manager" Apr 21 15:08:10.507845 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.507828 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:08:10.525050 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.525023 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-86d7574f74-nh66m"] Apr 21 15:08:10.553777 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.553741 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brsqd\" (UniqueName: \"kubernetes.io/projected/205e368b-ebfa-453c-b9ac-1a90a2447afb-kube-api-access-brsqd\") pod \"205e368b-ebfa-453c-b9ac-1a90a2447afb\" (UID: \"205e368b-ebfa-453c-b9ac-1a90a2447afb\") " Apr 21 15:08:10.555924 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.555888 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205e368b-ebfa-453c-b9ac-1a90a2447afb-kube-api-access-brsqd" (OuterVolumeSpecName: "kube-api-access-brsqd") pod "205e368b-ebfa-453c-b9ac-1a90a2447afb" (UID: "205e368b-ebfa-453c-b9ac-1a90a2447afb"). InnerVolumeSpecName "kube-api-access-brsqd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:08:10.655204 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.655118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nr2m\" (UniqueName: \"kubernetes.io/projected/918447b2-fcd0-4158-a5cb-22b918fced28-kube-api-access-7nr2m\") pod \"maas-controller-86d7574f74-nh66m\" (UID: \"918447b2-fcd0-4158-a5cb-22b918fced28\") " pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:08:10.655372 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.655219 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-brsqd\" (UniqueName: \"kubernetes.io/projected/205e368b-ebfa-453c-b9ac-1a90a2447afb-kube-api-access-brsqd\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:08:10.756032 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.755993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nr2m\" (UniqueName: \"kubernetes.io/projected/918447b2-fcd0-4158-a5cb-22b918fced28-kube-api-access-7nr2m\") pod \"maas-controller-86d7574f74-nh66m\" (UID: \"918447b2-fcd0-4158-a5cb-22b918fced28\") " pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:08:10.766733 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.766704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nr2m\" (UniqueName: \"kubernetes.io/projected/918447b2-fcd0-4158-a5cb-22b918fced28-kube-api-access-7nr2m\") pod \"maas-controller-86d7574f74-nh66m\" (UID: \"918447b2-fcd0-4158-a5cb-22b918fced28\") " pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:08:10.818945 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.818899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:08:10.958940 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:10.958905 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-86d7574f74-nh66m"] Apr 21 15:08:10.960555 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:08:10.960528 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod918447b2_fcd0_4158_a5cb_22b918fced28.slice/crio-cd1cde07ebed18f40abd8780a38e3f8a30e1112d4fe585b4a27da92e5c9079ae WatchSource:0}: Error finding container cd1cde07ebed18f40abd8780a38e3f8a30e1112d4fe585b4a27da92e5c9079ae: Status 404 returned error can't find the container with id cd1cde07ebed18f40abd8780a38e3f8a30e1112d4fe585b4a27da92e5c9079ae Apr 21 15:08:11.143076 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.143040 2576 generic.go:358] "Generic (PLEG): container finished" podID="205e368b-ebfa-453c-b9ac-1a90a2447afb" containerID="1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2" exitCode=0 Apr 21 15:08:11.143543 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.143099 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" Apr 21 15:08:11.143543 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.143111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" event={"ID":"205e368b-ebfa-453c-b9ac-1a90a2447afb","Type":"ContainerDied","Data":"1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2"} Apr 21 15:08:11.143543 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.143148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-g2gkj" event={"ID":"205e368b-ebfa-453c-b9ac-1a90a2447afb","Type":"ContainerDied","Data":"b6736a3cd92f09d67f3b2ae3ba1b9968f5318471dc0e21d6d90346270c4c5174"} Apr 21 15:08:11.143543 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.143164 2576 scope.go:117] "RemoveContainer" containerID="1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2" Apr 21 15:08:11.144641 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.144614 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86d7574f74-nh66m" event={"ID":"918447b2-fcd0-4158-a5cb-22b918fced28","Type":"ContainerStarted","Data":"cd1cde07ebed18f40abd8780a38e3f8a30e1112d4fe585b4a27da92e5c9079ae"} Apr 21 15:08:11.152072 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.152054 2576 scope.go:117] "RemoveContainer" containerID="1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2" Apr 21 15:08:11.152437 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:08:11.152419 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2\": container with ID starting with 1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2 not found: ID does not exist" containerID="1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2" Apr 21 15:08:11.152510 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.152443 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2"} err="failed to get container status \"1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2\": rpc error: code = NotFound desc = could not find container \"1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2\": container with ID starting with 1054b996da01ee2de52627e3e1e5d93486e34304856975535c09aaaf840e35e2 not found: ID does not exist" Apr 21 15:08:11.175915 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.175886 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g2gkj"] Apr 21 15:08:11.181400 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:11.181376 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-g2gkj"] Apr 21 15:08:12.150018 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:12.149981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86d7574f74-nh66m" event={"ID":"918447b2-fcd0-4158-a5cb-22b918fced28","Type":"ContainerStarted","Data":"3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8"} Apr 21 15:08:12.150477 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:12.150098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:08:12.169542 ip-10-0-130-121 
kubenswrapper[2576]: I0421 15:08:12.169488 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-86d7574f74-nh66m" podStartSLOduration=1.838244856 podStartE2EDuration="2.169475366s" podCreationTimestamp="2026-04-21 15:08:10 +0000 UTC" firstStartedPulling="2026-04-21 15:08:10.961779531 +0000 UTC m=+737.096757118" lastFinishedPulling="2026-04-21 15:08:11.293010042 +0000 UTC m=+737.427987628" observedRunningTime="2026-04-21 15:08:12.169208711 +0000 UTC m=+738.304186320" watchObservedRunningTime="2026-04-21 15:08:12.169475366 +0000 UTC m=+738.304452973" Apr 21 15:08:12.451787 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:12.451703 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="205e368b-ebfa-453c-b9ac-1a90a2447afb" path="/var/lib/kubelet/pods/205e368b-ebfa-453c-b9ac-1a90a2447afb/volumes" Apr 21 15:08:23.165603 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:23.165573 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:08:55.397276 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.397211 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-f4f6d6d76-gvtk6"] Apr 21 15:08:55.397840 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.397562 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" podUID="655fd2c5-bba3-477c-b11b-1515bc56c276" containerName="maas-api" containerID="cri-o://e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52" gracePeriod=30 Apr 21 15:08:55.648875 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.648814 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:55.772833 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.772803 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/655fd2c5-bba3-477c-b11b-1515bc56c276-maas-api-tls\") pod \"655fd2c5-bba3-477c-b11b-1515bc56c276\" (UID: \"655fd2c5-bba3-477c-b11b-1515bc56c276\") " Apr 21 15:08:55.773013 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.772859 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wm2\" (UniqueName: \"kubernetes.io/projected/655fd2c5-bba3-477c-b11b-1515bc56c276-kube-api-access-72wm2\") pod \"655fd2c5-bba3-477c-b11b-1515bc56c276\" (UID: \"655fd2c5-bba3-477c-b11b-1515bc56c276\") " Apr 21 15:08:55.775034 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.774997 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655fd2c5-bba3-477c-b11b-1515bc56c276-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "655fd2c5-bba3-477c-b11b-1515bc56c276" (UID: "655fd2c5-bba3-477c-b11b-1515bc56c276"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:08:55.775034 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.775021 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655fd2c5-bba3-477c-b11b-1515bc56c276-kube-api-access-72wm2" (OuterVolumeSpecName: "kube-api-access-72wm2") pod "655fd2c5-bba3-477c-b11b-1515bc56c276" (UID: "655fd2c5-bba3-477c-b11b-1515bc56c276"). InnerVolumeSpecName "kube-api-access-72wm2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:08:55.874395 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.874364 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/655fd2c5-bba3-477c-b11b-1515bc56c276-maas-api-tls\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:08:55.874395 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:55.874390 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72wm2\" (UniqueName: \"kubernetes.io/projected/655fd2c5-bba3-477c-b11b-1515bc56c276-kube-api-access-72wm2\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:08:56.324254 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.324195 2576 generic.go:358] "Generic (PLEG): container finished" podID="655fd2c5-bba3-477c-b11b-1515bc56c276" containerID="e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52" exitCode=0 Apr 21 15:08:56.324456 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.324273 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" Apr 21 15:08:56.324456 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.324261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" event={"ID":"655fd2c5-bba3-477c-b11b-1515bc56c276","Type":"ContainerDied","Data":"e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52"} Apr 21 15:08:56.324456 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.324390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f4f6d6d76-gvtk6" event={"ID":"655fd2c5-bba3-477c-b11b-1515bc56c276","Type":"ContainerDied","Data":"ca6953c57b208540ad4da0beeca6c1ef2c3ad2c30313ad15176dc36d9c1d3246"} Apr 21 15:08:56.324456 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.324411 2576 scope.go:117] "RemoveContainer" containerID="e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52" Apr 21 15:08:56.333738 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.333719 2576 scope.go:117] "RemoveContainer" containerID="e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52" Apr 21 15:08:56.333996 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:08:56.333980 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52\": container with ID starting with e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52 not found: ID does not exist" containerID="e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52" Apr 21 15:08:56.334043 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.334004 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52"} err="failed to get container status \"e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52\": rpc error: code = NotFound desc = could not find container \"e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52\": container with ID starting with e2bb3d2f708c6df36ecc7419ee988265dc935af46ea4695a4480d052e3553d52 not found: ID does not exist" Apr 21 15:08:56.347442 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.347418 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-f4f6d6d76-gvtk6"] Apr 21 15:08:56.351713 ip-10-0-130-121 
kubenswrapper[2576]: I0421 15:08:56.351693 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-f4f6d6d76-gvtk6"] Apr 21 15:08:56.451588 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:08:56.451557 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="655fd2c5-bba3-477c-b11b-1515bc56c276" path="/var/lib/kubelet/pods/655fd2c5-bba3-477c-b11b-1515bc56c276/volumes" Apr 21 15:09:34.044750 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.044706 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g"] Apr 21 15:09:34.045377 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.045357 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="655fd2c5-bba3-477c-b11b-1515bc56c276" containerName="maas-api" Apr 21 15:09:34.045475 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.045379 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="655fd2c5-bba3-477c-b11b-1515bc56c276" containerName="maas-api" Apr 21 15:09:34.045536 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.045477 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="655fd2c5-bba3-477c-b11b-1515bc56c276" containerName="maas-api" Apr 21 15:09:34.048844 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.048823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.056630 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.056606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-qj6x9\"" Apr 21 15:09:34.056877 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.056858 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 15:09:34.056956 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.056895 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 15:09:34.057018 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.056952 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 21 15:09:34.066932 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.066909 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g"] Apr 21 15:09:34.131979 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.131940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzm5p\" (UniqueName: \"kubernetes.io/projected/6ef91b59-0ad5-4b29-9123-dfda088056c4-kube-api-access-mzm5p\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.131979 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.131988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.132283 ip-10-0-130-121 kubenswrapper[2576]: I0421 
15:09:34.132051 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.132283 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.132157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef91b59-0ad5-4b29-9123-dfda088056c4-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.132283 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.132203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.132283 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.132268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.232759 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.232716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef91b59-0ad5-4b29-9123-dfda088056c4-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.232956 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.232775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.232956 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.232818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.232956 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.232849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzm5p\" (UniqueName: \"kubernetes.io/projected/6ef91b59-0ad5-4b29-9123-dfda088056c4-kube-api-access-mzm5p\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.233117 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.232987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.233117 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.233039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.233222 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.233187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.233326 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.233262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.233326 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.233310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.235307 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.235278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6ef91b59-0ad5-4b29-9123-dfda088056c4-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.235468 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.235452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef91b59-0ad5-4b29-9123-dfda088056c4-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.245958 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.245932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzm5p\" (UniqueName: 
\"kubernetes.io/projected/6ef91b59-0ad5-4b29-9123-dfda088056c4-kube-api-access-mzm5p\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zs92g\" (UID: \"6ef91b59-0ad5-4b29-9123-dfda088056c4\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.357965 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.357869 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:34.507468 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.503851 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g"] Apr 21 15:09:34.508532 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:09:34.508502 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ef91b59_0ad5_4b29_9123_dfda088056c4.slice/crio-2ac6625f2adaa5eba369676c076d54918a432238a6f038e925a359e6e4036985 WatchSource:0}: Error finding container 2ac6625f2adaa5eba369676c076d54918a432238a6f038e925a359e6e4036985: Status 404 returned error can't find the container with id 2ac6625f2adaa5eba369676c076d54918a432238a6f038e925a359e6e4036985 Apr 21 15:09:34.563009 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:34.562976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:09:35.484406 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:35.484367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" event={"ID":"6ef91b59-0ad5-4b29-9123-dfda088056c4","Type":"ContainerStarted","Data":"2ac6625f2adaa5eba369676c076d54918a432238a6f038e925a359e6e4036985"} Apr 21 15:09:39.885218 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:39.885180 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:09:42.520962 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:42.520923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" event={"ID":"6ef91b59-0ad5-4b29-9123-dfda088056c4","Type":"ContainerStarted","Data":"01b9b553f57b0f8a84a1989b24b9ff63caf8657e65ac958a33ec0ab256124545"} Apr 21 15:09:42.943922 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:42.943827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:09:52.560460 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:52.560425 2576 generic.go:358] "Generic (PLEG): container finished" podID="6ef91b59-0ad5-4b29-9123-dfda088056c4" containerID="01b9b553f57b0f8a84a1989b24b9ff63caf8657e65ac958a33ec0ab256124545" exitCode=0 Apr 21 15:09:52.560808 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:52.560495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" event={"ID":"6ef91b59-0ad5-4b29-9123-dfda088056c4","Type":"ContainerDied","Data":"01b9b553f57b0f8a84a1989b24b9ff63caf8657e65ac958a33ec0ab256124545"} Apr 21 15:09:54.568860 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:54.568823 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" 
event={"ID":"6ef91b59-0ad5-4b29-9123-dfda088056c4","Type":"ContainerStarted","Data":"a1536c0697466b20f9f11b62b3aa644836c2bf9e516791f2eed055eeec1530cc"} Apr 21 15:09:54.569226 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:54.569010 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:09:54.588917 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:54.588870 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" podStartSLOduration=1.347705859 podStartE2EDuration="20.588857175s" podCreationTimestamp="2026-04-21 15:09:34 +0000 UTC" firstStartedPulling="2026-04-21 15:09:34.510706394 +0000 UTC m=+820.645683985" lastFinishedPulling="2026-04-21 15:09:53.7518577 +0000 UTC m=+839.886835301" observedRunningTime="2026-04-21 15:09:54.586800825 +0000 UTC m=+840.721778434" watchObservedRunningTime="2026-04-21 15:09:54.588857175 +0000 UTC m=+840.723834786" Apr 21 15:09:57.459020 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.458985 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz"] Apr 21 15:09:57.470580 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.470550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.474173 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.473378 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz"] Apr 21 15:09:57.474173 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.473584 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 21 15:09:57.474173 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.473927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.575292 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.575219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff75fd08-1777-4ba9-8743-939ef551b3cd-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.575467 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.575312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6mg\" (UniqueName: \"kubernetes.io/projected/ff75fd08-1777-4ba9-8743-939ef551b3cd-kube-api-access-ml6mg\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.575467 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.575369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" 
(UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.575467 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.575448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.575660 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.575516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.575660 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.575562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.575768 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.575736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.676796 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.676753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.676951 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.676836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.676951 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.676888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.676951 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.676937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff75fd08-1777-4ba9-8743-939ef551b3cd-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: 
\"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.677098 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.676978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6mg\" (UniqueName: \"kubernetes.io/projected/ff75fd08-1777-4ba9-8743-939ef551b3cd-kube-api-access-ml6mg\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.677153 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.677100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.677291 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.677234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.679316 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.679292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ff75fd08-1777-4ba9-8743-939ef551b3cd-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.679584 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.679564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff75fd08-1777-4ba9-8743-939ef551b3cd-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.685576 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.685557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml6mg\" (UniqueName: \"kubernetes.io/projected/ff75fd08-1777-4ba9-8743-939ef551b3cd-kube-api-access-ml6mg\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz\" (UID: \"ff75fd08-1777-4ba9-8743-939ef551b3cd\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.794573 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.794534 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:09:57.941176 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:57.941144 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz"] Apr 21 15:09:57.943008 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:09:57.942980 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff75fd08_1777_4ba9_8743_939ef551b3cd.slice/crio-7df18ca7800971dda775cfc6753210159674edeb81affda1ea34d56a807f2a78 WatchSource:0}: Error finding container 7df18ca7800971dda775cfc6753210159674edeb81affda1ea34d56a807f2a78: Status 404 returned error can't find the container with id 7df18ca7800971dda775cfc6753210159674edeb81affda1ea34d56a807f2a78 Apr 21 15:09:58.585085 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:58.585047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" event={"ID":"ff75fd08-1777-4ba9-8743-939ef551b3cd","Type":"ContainerStarted","Data":"56238e4b886d8d5afa5e1775b3997fe5afb419d9318c7c655b571a49a7494bd2"} Apr 21 15:09:58.585085 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:58.585091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" event={"ID":"ff75fd08-1777-4ba9-8743-939ef551b3cd","Type":"ContainerStarted","Data":"7df18ca7800971dda775cfc6753210159674edeb81affda1ea34d56a807f2a78"} Apr 21 15:09:59.376184 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:09:59.376151 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:10:04.611455 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:04.611412 2576 generic.go:358] "Generic (PLEG): container finished" podID="ff75fd08-1777-4ba9-8743-939ef551b3cd" containerID="56238e4b886d8d5afa5e1775b3997fe5afb419d9318c7c655b571a49a7494bd2" exitCode=0 Apr 21 15:10:04.611920 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:04.611493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" event={"ID":"ff75fd08-1777-4ba9-8743-939ef551b3cd","Type":"ContainerDied","Data":"56238e4b886d8d5afa5e1775b3997fe5afb419d9318c7c655b571a49a7494bd2"} Apr 21 15:10:05.586445 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:05.586414 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zs92g" Apr 21 15:10:05.618497 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:05.618464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" event={"ID":"ff75fd08-1777-4ba9-8743-939ef551b3cd","Type":"ContainerStarted","Data":"cac33ff88ca8fdd461134dfb91f99bb74afd68776e835ce3e083a7bc876b85df"} Apr 21 15:10:05.618916 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:05.618693 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:10:05.645483 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:05.645421 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" podStartSLOduration=8.15302547 podStartE2EDuration="8.645402531s" podCreationTimestamp="2026-04-21 15:09:57 +0000 UTC" firstStartedPulling="2026-04-21 15:10:04.612415864 +0000 UTC 
m=+850.747393469" lastFinishedPulling="2026-04-21 15:10:05.104792917 +0000 UTC m=+851.239770530" observedRunningTime="2026-04-21 15:10:05.643916508 +0000 UTC m=+851.778894118" watchObservedRunningTime="2026-04-21 15:10:05.645402531 +0000 UTC m=+851.780380140" Apr 21 15:10:16.636327 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:16.636297 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz" Apr 21 15:10:22.180336 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:22.180298 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:10:38.871385 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:10:38.871347 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:12:10.773807 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:12:10.773772 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:12:20.463056 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:12:20.462976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:12:29.272112 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:12:29.272065 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:12:39.991516 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:12:39.991478 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:12:49.219317 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:12:49.219271 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:13:00.598869 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:00.598833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:13:06.715441 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:06.715397 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-86d7574f74-nh66m"] Apr 21 15:13:06.715865 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:06.715665 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-86d7574f74-nh66m" podUID="918447b2-fcd0-4158-a5cb-22b918fced28" containerName="manager" containerID="cri-o://3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8" gracePeriod=10 Apr 21 15:13:06.968804 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:06.968736 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:13:07.077659 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.077618 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nr2m\" (UniqueName: \"kubernetes.io/projected/918447b2-fcd0-4158-a5cb-22b918fced28-kube-api-access-7nr2m\") pod \"918447b2-fcd0-4158-a5cb-22b918fced28\" (UID: \"918447b2-fcd0-4158-a5cb-22b918fced28\") " Apr 21 15:13:07.079918 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.079891 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/918447b2-fcd0-4158-a5cb-22b918fced28-kube-api-access-7nr2m" (OuterVolumeSpecName: "kube-api-access-7nr2m") pod "918447b2-fcd0-4158-a5cb-22b918fced28" (UID: "918447b2-fcd0-4158-a5cb-22b918fced28"). InnerVolumeSpecName "kube-api-access-7nr2m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:13:07.178749 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.178717 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7nr2m\" (UniqueName: \"kubernetes.io/projected/918447b2-fcd0-4158-a5cb-22b918fced28-kube-api-access-7nr2m\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:13:07.327938 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.327903 2576 generic.go:358] "Generic (PLEG): container finished" podID="918447b2-fcd0-4158-a5cb-22b918fced28" containerID="3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8" exitCode=0 Apr 21 15:13:07.328126 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.327967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86d7574f74-nh66m" event={"ID":"918447b2-fcd0-4158-a5cb-22b918fced28","Type":"ContainerDied","Data":"3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8"} Apr 21 15:13:07.328126 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.327996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86d7574f74-nh66m" event={"ID":"918447b2-fcd0-4158-a5cb-22b918fced28","Type":"ContainerDied","Data":"cd1cde07ebed18f40abd8780a38e3f8a30e1112d4fe585b4a27da92e5c9079ae"} Apr 21 15:13:07.328126 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.328016 2576 scope.go:117] "RemoveContainer" containerID="3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8" Apr 21 15:13:07.328126 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.327969 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-86d7574f74-nh66m" Apr 21 15:13:07.338518 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.338494 2576 scope.go:117] "RemoveContainer" containerID="3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8" Apr 21 15:13:07.338801 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:13:07.338779 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8\": container with ID starting with 3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8 not found: ID does not exist" containerID="3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8" Apr 21 15:13:07.338874 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.338814 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8"} err="failed to get container status \"3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8\": rpc error: code = NotFound desc = could not find container \"3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8\": container with ID starting with 3c88f3021e36842c4c70e518bd6a1bce56092e631873896cc72f1a875e198db8 not found: ID does not exist" Apr 21 15:13:07.376136 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.376092 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-86d7574f74-nh66m"] Apr 21 15:13:07.386692 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:07.386653 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-86d7574f74-nh66m"] Apr 21 15:13:08.160156 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.160123 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-86d7574f74-4cmcv"] Apr 21 15:13:08.160629 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.160614 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="918447b2-fcd0-4158-a5cb-22b918fced28" containerName="manager" Apr 21 15:13:08.160683 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.160631 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="918447b2-fcd0-4158-a5cb-22b918fced28" containerName="manager" Apr 21 15:13:08.160725 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.160706 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="918447b2-fcd0-4158-a5cb-22b918fced28" containerName="manager" Apr 21 15:13:08.165009 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.164991 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-86d7574f74-4cmcv" Apr 21 15:13:08.167357 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.167338 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-jm2k7\"" Apr 21 15:13:08.183841 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.183821 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-86d7574f74-4cmcv"] Apr 21 15:13:08.288166 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.288125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwzj7\" (UniqueName: \"kubernetes.io/projected/10e56336-552f-427b-bf92-e8b259d808b4-kube-api-access-bwzj7\") pod \"maas-controller-86d7574f74-4cmcv\" (UID: \"10e56336-552f-427b-bf92-e8b259d808b4\") " pod="opendatahub/maas-controller-86d7574f74-4cmcv" Apr 21 15:13:08.389289 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.389224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwzj7\" (UniqueName: \"kubernetes.io/projected/10e56336-552f-427b-bf92-e8b259d808b4-kube-api-access-bwzj7\") pod \"maas-controller-86d7574f74-4cmcv\" (UID: \"10e56336-552f-427b-bf92-e8b259d808b4\") " pod="opendatahub/maas-controller-86d7574f74-4cmcv" Apr 21 15:13:08.397792 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.397756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwzj7\" (UniqueName: \"kubernetes.io/projected/10e56336-552f-427b-bf92-e8b259d808b4-kube-api-access-bwzj7\") pod \"maas-controller-86d7574f74-4cmcv\" (UID: \"10e56336-552f-427b-bf92-e8b259d808b4\") " pod="opendatahub/maas-controller-86d7574f74-4cmcv" Apr 21 15:13:08.452578 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.452501 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="918447b2-fcd0-4158-a5cb-22b918fced28" path="/var/lib/kubelet/pods/918447b2-fcd0-4158-a5cb-22b918fced28/volumes" Apr 21 15:13:08.474670 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.474624 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-86d7574f74-4cmcv" Apr 21 15:13:08.615298 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.615264 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-86d7574f74-4cmcv"] Apr 21 15:13:08.620147 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:13:08.620118 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10e56336_552f_427b_bf92_e8b259d808b4.slice/crio-98af630814a6335562187654abd50df9a1ede3b63bb1e08a1af53d005b2bed69 WatchSource:0}: Error finding container 98af630814a6335562187654abd50df9a1ede3b63bb1e08a1af53d005b2bed69: Status 404 returned error can't find the container with id 98af630814a6335562187654abd50df9a1ede3b63bb1e08a1af53d005b2bed69 Apr 21 15:13:08.621534 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:08.621515 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:13:09.337933 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:09.337894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86d7574f74-4cmcv" event={"ID":"10e56336-552f-427b-bf92-e8b259d808b4","Type":"ContainerStarted","Data":"bf6033f94000b7de3946ea02ba90291238974f150af332131e473516def6dae3"} Apr 21 15:13:09.337933 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:09.337932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86d7574f74-4cmcv" event={"ID":"10e56336-552f-427b-bf92-e8b259d808b4","Type":"ContainerStarted","Data":"98af630814a6335562187654abd50df9a1ede3b63bb1e08a1af53d005b2bed69"} Apr 21 15:13:09.338390 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:09.337982 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-86d7574f74-4cmcv" Apr 21 15:13:09.355672 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:09.355565 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-86d7574f74-4cmcv" podStartSLOduration=0.87728712 podStartE2EDuration="1.355550556s" podCreationTimestamp="2026-04-21 15:13:08 +0000 UTC" firstStartedPulling="2026-04-21 15:13:08.621639465 +0000 UTC m=+1034.756617057" lastFinishedPulling="2026-04-21 15:13:09.099902903 +0000 UTC m=+1035.234880493" observedRunningTime="2026-04-21 15:13:09.353855856 +0000 UTC m=+1035.488833464" watchObservedRunningTime="2026-04-21 15:13:09.355550556 +0000 UTC m=+1035.490528162" Apr 21 15:13:20.347850 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:13:20.347814 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-86d7574f74-4cmcv" Apr 21 15:14:02.007973 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:14:02.007934 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:14:16.873799 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:14:16.873756 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:14:56.611301 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:14:56.611258 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:15:00.153012 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.152961 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29613075-fhmh9"] Apr 21 
15:15:00.156180 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.156155 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" Apr 21 15:15:00.158585 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.158562 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-xx77w\"" Apr 21 15:15:00.173679 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.173648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx4r2\" (UniqueName: \"kubernetes.io/projected/9eb900ab-319f-48d8-b6be-bd6bc9cd0989-kube-api-access-lx4r2\") pod \"maas-api-key-cleanup-29613075-fhmh9\" (UID: \"9eb900ab-319f-48d8-b6be-bd6bc9cd0989\") " pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" Apr 21 15:15:00.176388 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.176366 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613075-fhmh9"] Apr 21 15:15:00.274516 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.274480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lx4r2\" (UniqueName: \"kubernetes.io/projected/9eb900ab-319f-48d8-b6be-bd6bc9cd0989-kube-api-access-lx4r2\") pod \"maas-api-key-cleanup-29613075-fhmh9\" (UID: \"9eb900ab-319f-48d8-b6be-bd6bc9cd0989\") " pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" Apr 21 15:15:00.282847 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.282820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx4r2\" (UniqueName: \"kubernetes.io/projected/9eb900ab-319f-48d8-b6be-bd6bc9cd0989-kube-api-access-lx4r2\") pod \"maas-api-key-cleanup-29613075-fhmh9\" (UID: \"9eb900ab-319f-48d8-b6be-bd6bc9cd0989\") " pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" Apr 21 15:15:00.466862 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.466759 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" Apr 21 15:15:00.597217 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.597188 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613075-fhmh9"] Apr 21 15:15:00.599559 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:15:00.599531 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eb900ab_319f_48d8_b6be_bd6bc9cd0989.slice/crio-9277d91e638073bc0b9e44579fdad4b5073682b12015b1fd4c4cc32f9802e1ba WatchSource:0}: Error finding container 9277d91e638073bc0b9e44579fdad4b5073682b12015b1fd4c4cc32f9802e1ba: Status 404 returned error can't find the container with id 9277d91e638073bc0b9e44579fdad4b5073682b12015b1fd4c4cc32f9802e1ba Apr 21 15:15:00.766874 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:00.766832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" event={"ID":"9eb900ab-319f-48d8-b6be-bd6bc9cd0989","Type":"ContainerStarted","Data":"9277d91e638073bc0b9e44579fdad4b5073682b12015b1fd4c4cc32f9802e1ba"} Apr 21 15:15:02.776893 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:02.776857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" event={"ID":"9eb900ab-319f-48d8-b6be-bd6bc9cd0989","Type":"ContainerStarted","Data":"ac0f9b407a42b18a73c1e0c7f989853ee9efdbd048f9e20171d5f7905f316668"} Apr 21 15:15:02.793683 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:02.793628 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" podStartSLOduration=1.39470766 podStartE2EDuration="2.793611311s" podCreationTimestamp="2026-04-21 15:15:00 +0000 UTC" firstStartedPulling="2026-04-21 15:15:00.601936764 +0000 UTC m=+1146.736914366" lastFinishedPulling="2026-04-21 15:15:02.000840409 +0000 UTC m=+1148.135818017" observedRunningTime="2026-04-21 15:15:02.791027537 +0000 UTC m=+1148.926005158" watchObservedRunningTime="2026-04-21 15:15:02.793611311 +0000 UTC m=+1148.928588919" Apr 21 15:15:14.572210 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:14.572128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:15:22.863585 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:22.863547 2576 generic.go:358] "Generic (PLEG): container finished" podID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerID="ac0f9b407a42b18a73c1e0c7f989853ee9efdbd048f9e20171d5f7905f316668" exitCode=6 Apr 21 15:15:22.863954 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:22.863620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" event={"ID":"9eb900ab-319f-48d8-b6be-bd6bc9cd0989","Type":"ContainerDied","Data":"ac0f9b407a42b18a73c1e0c7f989853ee9efdbd048f9e20171d5f7905f316668"} Apr 21 15:15:22.863997 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:22.863981 2576 scope.go:117] "RemoveContainer" containerID="ac0f9b407a42b18a73c1e0c7f989853ee9efdbd048f9e20171d5f7905f316668" Apr 21 15:15:23.870066 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:23.870023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" event={"ID":"9eb900ab-319f-48d8-b6be-bd6bc9cd0989","Type":"ContainerStarted","Data":"ddee7faccea59a582a2bfa7ae9410c8911a9a79bd6a9bd72d97f6a37f14c336c"} Apr 21 15:15:28.364646 
ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:28.364611 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:15:43.946674 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:43.946579 2576 generic.go:358] "Generic (PLEG): container finished" podID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerID="ddee7faccea59a582a2bfa7ae9410c8911a9a79bd6a9bd72d97f6a37f14c336c" exitCode=6 Apr 21 15:15:43.946674 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:43.946651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" event={"ID":"9eb900ab-319f-48d8-b6be-bd6bc9cd0989","Type":"ContainerDied","Data":"ddee7faccea59a582a2bfa7ae9410c8911a9a79bd6a9bd72d97f6a37f14c336c"} Apr 21 15:15:43.947171 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:43.946698 2576 scope.go:117] "RemoveContainer" containerID="ac0f9b407a42b18a73c1e0c7f989853ee9efdbd048f9e20171d5f7905f316668" Apr 21 15:15:43.947171 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:43.947047 2576 scope.go:117] "RemoveContainer" containerID="ddee7faccea59a582a2bfa7ae9410c8911a9a79bd6a9bd72d97f6a37f14c336c" Apr 21 15:15:43.947378 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:15:43.947354 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29613075-fhmh9_opendatahub(9eb900ab-319f-48d8-b6be-bd6bc9cd0989)\"" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" Apr 21 15:15:45.216290 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:45.216234 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:15:59.446752 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:15:59.446719 2576 scope.go:117] "RemoveContainer" containerID="ddee7faccea59a582a2bfa7ae9410c8911a9a79bd6a9bd72d97f6a37f14c336c" Apr 21 15:16:00.010152 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:00.010115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" event={"ID":"9eb900ab-319f-48d8-b6be-bd6bc9cd0989","Type":"ContainerStarted","Data":"81b89b7729f07901d155588f6c0a80250029461667c7041325db7f0b7e34bfff"} Apr 21 15:16:00.013435 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:00.013408 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613075-fhmh9"] Apr 21 15:16:01.013978 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:01.013938 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" containerID="cri-o://81b89b7729f07901d155588f6c0a80250029461667c7041325db7f0b7e34bfff" gracePeriod=30 Apr 21 15:16:20.089988 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:20.089803 2576 generic.go:358] "Generic (PLEG): container finished" podID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerID="81b89b7729f07901d155588f6c0a80250029461667c7041325db7f0b7e34bfff" exitCode=6 Apr 21 15:16:20.089988 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:20.089924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" 
event={"ID":"9eb900ab-319f-48d8-b6be-bd6bc9cd0989","Type":"ContainerDied","Data":"81b89b7729f07901d155588f6c0a80250029461667c7041325db7f0b7e34bfff"} Apr 21 15:16:20.089988 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:20.089966 2576 scope.go:117] "RemoveContainer" containerID="ddee7faccea59a582a2bfa7ae9410c8911a9a79bd6a9bd72d97f6a37f14c336c" Apr 21 15:16:20.168793 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:20.168766 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" Apr 21 15:16:20.258124 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:20.258086 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx4r2\" (UniqueName: \"kubernetes.io/projected/9eb900ab-319f-48d8-b6be-bd6bc9cd0989-kube-api-access-lx4r2\") pod \"9eb900ab-319f-48d8-b6be-bd6bc9cd0989\" (UID: \"9eb900ab-319f-48d8-b6be-bd6bc9cd0989\") " Apr 21 15:16:20.260316 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:20.260279 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb900ab-319f-48d8-b6be-bd6bc9cd0989-kube-api-access-lx4r2" (OuterVolumeSpecName: "kube-api-access-lx4r2") pod "9eb900ab-319f-48d8-b6be-bd6bc9cd0989" (UID: "9eb900ab-319f-48d8-b6be-bd6bc9cd0989"). InnerVolumeSpecName "kube-api-access-lx4r2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:16:20.359608 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:20.359565 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lx4r2\" (UniqueName: \"kubernetes.io/projected/9eb900ab-319f-48d8-b6be-bd6bc9cd0989-kube-api-access-lx4r2\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:16:21.094958 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:21.094929 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" Apr 21 15:16:21.095442 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:21.094923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613075-fhmh9" event={"ID":"9eb900ab-319f-48d8-b6be-bd6bc9cd0989","Type":"ContainerDied","Data":"9277d91e638073bc0b9e44579fdad4b5073682b12015b1fd4c4cc32f9802e1ba"} Apr 21 15:16:21.095442 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:21.095068 2576 scope.go:117] "RemoveContainer" containerID="81b89b7729f07901d155588f6c0a80250029461667c7041325db7f0b7e34bfff" Apr 21 15:16:21.111385 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:21.111359 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613075-fhmh9"] Apr 21 15:16:21.114179 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:21.114156 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613075-fhmh9"] Apr 21 15:16:22.452073 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:22.452033 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" path="/var/lib/kubelet/pods/9eb900ab-319f-48d8-b6be-bd6bc9cd0989/volumes" Apr 21 15:16:39.491918 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:39.491880 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:16:47.860087 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:16:47.859999 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:17:04.774914 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:17:04.774880 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:17:13.287887 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:17:13.287818 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:17:30.172268 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:17:30.172206 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:17:38.455893 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:17:38.455862 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:18:11.182191 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:18:11.182151 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:18:20.159548 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:18:20.159462 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:18:24.059888 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:18:24.059849 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:18:28.485279 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:18:28.485231 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:18:36.757841 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:18:36.757798 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:18:45.759132 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:18:45.759081 2576 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:19:02.074333 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:19:02.074294 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:19:15.862291 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:19:15.862232 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:20:03.456154 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:20:03.456063 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:20:11.861722 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:20:11.861686 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:20:21.256439 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:20:21.256400 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:20:29.282348 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:20:29.282309 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:20:39.074893 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:20:39.074847 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:20:46.658061 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:20:46.658020 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:20:56.066320 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:20:56.066280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:21:04.458995 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:21:04.458955 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:21:13.469856 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:21:13.469768 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:21:22.075749 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:21:22.075712 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:21:31.660614 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:21:31.660580 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:21:40.263191 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:21:40.263150 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:21:48.759387 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:21:48.759347 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:21:57.160289 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:21:57.160235 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:22:06.186569 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:22:06.186533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:22:14.269347 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:22:14.269304 2576 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:22:23.063559 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:22:23.063520 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:22:31.672420 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:22:31.672386 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:23:25.632793 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:25.632698 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5"] Apr 21 15:23:25.633309 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:25.632935 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" podUID="51a8e561-a655-48b2-8ed6-57efb52a742b" containerName="manager" containerID="cri-o://5cfc2a871c3b1323282cbb3926c3d28566767baa2aa09c61943a76b42dd4188d" gracePeriod=10 Apr 21 15:23:25.749743 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:25.749711 2576 generic.go:358] "Generic (PLEG): container finished" podID="51a8e561-a655-48b2-8ed6-57efb52a742b" containerID="5cfc2a871c3b1323282cbb3926c3d28566767baa2aa09c61943a76b42dd4188d" exitCode=0 Apr 21 15:23:25.749891 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:25.749786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" event={"ID":"51a8e561-a655-48b2-8ed6-57efb52a742b","Type":"ContainerDied","Data":"5cfc2a871c3b1323282cbb3926c3d28566767baa2aa09c61943a76b42dd4188d"} Apr 21 15:23:25.891507 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:25.891441 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:23:26.031625 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.031585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxwfh\" (UniqueName: \"kubernetes.io/projected/51a8e561-a655-48b2-8ed6-57efb52a742b-kube-api-access-vxwfh\") pod \"51a8e561-a655-48b2-8ed6-57efb52a742b\" (UID: \"51a8e561-a655-48b2-8ed6-57efb52a742b\") " Apr 21 15:23:26.031803 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.031671 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51a8e561-a655-48b2-8ed6-57efb52a742b-extensions-socket-volume\") pod \"51a8e561-a655-48b2-8ed6-57efb52a742b\" (UID: \"51a8e561-a655-48b2-8ed6-57efb52a742b\") " Apr 21 15:23:26.032101 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.032074 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a8e561-a655-48b2-8ed6-57efb52a742b-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "51a8e561-a655-48b2-8ed6-57efb52a742b" (UID: "51a8e561-a655-48b2-8ed6-57efb52a742b"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:23:26.033874 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.033842 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a8e561-a655-48b2-8ed6-57efb52a742b-kube-api-access-vxwfh" (OuterVolumeSpecName: "kube-api-access-vxwfh") pod "51a8e561-a655-48b2-8ed6-57efb52a742b" (UID: "51a8e561-a655-48b2-8ed6-57efb52a742b"). InnerVolumeSpecName "kube-api-access-vxwfh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:23:26.133486 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.133437 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxwfh\" (UniqueName: \"kubernetes.io/projected/51a8e561-a655-48b2-8ed6-57efb52a742b-kube-api-access-vxwfh\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:23:26.133486 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.133487 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51a8e561-a655-48b2-8ed6-57efb52a742b-extensions-socket-volume\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:23:26.754619 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.754588 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" Apr 21 15:23:26.754619 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.754601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5" event={"ID":"51a8e561-a655-48b2-8ed6-57efb52a742b","Type":"ContainerDied","Data":"4e9a0fe8c50b42759f866dd20a04a886c1e8b352c96a711ec40901183976236b"} Apr 21 15:23:26.755154 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.754643 2576 scope.go:117] "RemoveContainer" containerID="5cfc2a871c3b1323282cbb3926c3d28566767baa2aa09c61943a76b42dd4188d" Apr 21 15:23:26.770734 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.770671 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5"] Apr 21 15:23:26.777014 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:26.776990 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-dkhw5"] Apr 21 15:23:28.457623 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:23:28.457585 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a8e561-a655-48b2-8ed6-57efb52a742b" path="/var/lib/kubelet/pods/51a8e561-a655-48b2-8ed6-57efb52a742b/volumes" Apr 21 15:24:31.733475 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733437 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm"] Apr 21 15:24:31.733943 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733853 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51a8e561-a655-48b2-8ed6-57efb52a742b" containerName="manager" Apr 21 15:24:31.733943 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733866 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a8e561-a655-48b2-8ed6-57efb52a742b" containerName="manager" Apr 21 15:24:31.733943 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733874 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" 
containerName="cleanup" Apr 21 15:24:31.733943 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733880 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" Apr 21 15:24:31.733943 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733891 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" Apr 21 15:24:31.733943 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733897 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" Apr 21 15:24:31.733943 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733905 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" Apr 21 15:24:31.733943 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733910 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" Apr 21 15:24:31.734195 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733964 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" Apr 21 15:24:31.734195 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733973 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="51a8e561-a655-48b2-8ed6-57efb52a742b" containerName="manager" Apr 21 15:24:31.734195 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.733981 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" Apr 21 15:24:31.736934 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.736918 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:31.739392 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.739374 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-lgvpp\"" Apr 21 15:24:31.750145 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.750121 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm"] Apr 21 15:24:31.873655 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.873616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a74a1ef-f649-4cdf-8c94-a9609ac8cc10-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2xjcm\" (UID: \"9a74a1ef-f649-4cdf-8c94-a9609ac8cc10\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:31.873898 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.873739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ct9d\" (UniqueName: \"kubernetes.io/projected/9a74a1ef-f649-4cdf-8c94-a9609ac8cc10-kube-api-access-8ct9d\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2xjcm\" (UID: \"9a74a1ef-f649-4cdf-8c94-a9609ac8cc10\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:31.975319 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.975275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a74a1ef-f649-4cdf-8c94-a9609ac8cc10-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2xjcm\" (UID: \"9a74a1ef-f649-4cdf-8c94-a9609ac8cc10\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:31.975497 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.975371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ct9d\" (UniqueName: \"kubernetes.io/projected/9a74a1ef-f649-4cdf-8c94-a9609ac8cc10-kube-api-access-8ct9d\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2xjcm\" (UID: \"9a74a1ef-f649-4cdf-8c94-a9609ac8cc10\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:31.975624 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.975601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a74a1ef-f649-4cdf-8c94-a9609ac8cc10-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2xjcm\" (UID: \"9a74a1ef-f649-4cdf-8c94-a9609ac8cc10\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:31.986400 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:31.986330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ct9d\" (UniqueName: \"kubernetes.io/projected/9a74a1ef-f649-4cdf-8c94-a9609ac8cc10-kube-api-access-8ct9d\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2xjcm\" (UID: \"9a74a1ef-f649-4cdf-8c94-a9609ac8cc10\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:32.047591 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:32.047559 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:32.196746 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:24:32.196713 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a74a1ef_f649_4cdf_8c94_a9609ac8cc10.slice/crio-4cf7328bc43ad98e8c9f51465e7475e0da439bdb88d90ae1b6017030b92f8419 WatchSource:0}: Error finding container 4cf7328bc43ad98e8c9f51465e7475e0da439bdb88d90ae1b6017030b92f8419: Status 404 returned error can't find the container with id 4cf7328bc43ad98e8c9f51465e7475e0da439bdb88d90ae1b6017030b92f8419 Apr 21 15:24:32.197087 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:32.197060 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm"] Apr 21 15:24:32.199650 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:32.199627 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:24:33.010581 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:33.010539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" event={"ID":"9a74a1ef-f649-4cdf-8c94-a9609ac8cc10","Type":"ContainerStarted","Data":"e86109891dbc76caf79f326f6d15923cd289d12f452f2fdb618ade8c427d87bb"} Apr 21 15:24:33.010581 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:33.010582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" event={"ID":"9a74a1ef-f649-4cdf-8c94-a9609ac8cc10","Type":"ContainerStarted","Data":"4cf7328bc43ad98e8c9f51465e7475e0da439bdb88d90ae1b6017030b92f8419"} Apr 21 15:24:33.011011 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:33.010717 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:33.034043 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:33.033985 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" podStartSLOduration=2.033967241 podStartE2EDuration="2.033967241s" podCreationTimestamp="2026-04-21 15:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:24:33.033451485 +0000 UTC m=+1719.168429098" watchObservedRunningTime="2026-04-21 15:24:33.033967241 +0000 UTC m=+1719.168944851" Apr 21 15:24:44.016790 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:44.016761 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2xjcm" Apr 21 15:24:52.657786 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:52.657751 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:24:57.766922 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:24:57.766884 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:25:27.269619 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:25:27.269575 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:25:34.592136 ip-10-0-130-121 
kubenswrapper[2576]: I0421 15:25:34.592101 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:25:44.340773 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:25:44.340688 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:25:54.315488 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:25:54.315451 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:26:02.817259 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:26:02.817210 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:26:13.870256 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:26:13.870202 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:26:22.160644 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:26:22.160599 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:26:33.064781 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:26:33.064741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:26:42.267088 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:26:42.267045 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:26:52.468086 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:26:52.468045 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:27:02.058420 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:27:02.058381 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:27:34.564867 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:27:34.564822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:28:17.193147 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:28:17.193111 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:28:26.367068 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:28:26.367027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:28:34.158612 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:28:34.158566 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:28:43.859610 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:28:43.859521 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:28:52.571762 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:28:52.571721 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:29:05.262065 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:29:05.262020 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:29:13.454888 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:29:13.454845 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:29:20.955921 ip-10-0-130-121 
kubenswrapper[2576]: I0421 15:29:20.955885 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:29:30.710190 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:29:30.710147 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:29:39.071660 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:29:39.071622 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:29:47.473566 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:29:47.473530 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:29:58.463501 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:29:58.463463 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:30:00.153413 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.153373 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29613090-wdfz7"] Apr 21 15:30:00.154057 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.154036 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9eb900ab-319f-48d8-b6be-bd6bc9cd0989" containerName="cleanup" Apr 21 15:30:00.156984 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.156962 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" Apr 21 15:30:00.160733 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.160712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-xx77w\"" Apr 21 15:30:00.187925 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.187888 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613090-wdfz7"] Apr 21 15:30:00.260328 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.260290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxvzx\" (UniqueName: \"kubernetes.io/projected/9d0d1fe1-9772-43d0-878d-6b205a036748-kube-api-access-fxvzx\") pod \"maas-api-key-cleanup-29613090-wdfz7\" (UID: \"9d0d1fe1-9772-43d0-878d-6b205a036748\") " pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" Apr 21 15:30:00.361772 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.361735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxvzx\" (UniqueName: \"kubernetes.io/projected/9d0d1fe1-9772-43d0-878d-6b205a036748-kube-api-access-fxvzx\") pod \"maas-api-key-cleanup-29613090-wdfz7\" (UID: \"9d0d1fe1-9772-43d0-878d-6b205a036748\") " pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" Apr 21 15:30:00.371398 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.371364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxvzx\" (UniqueName: \"kubernetes.io/projected/9d0d1fe1-9772-43d0-878d-6b205a036748-kube-api-access-fxvzx\") pod \"maas-api-key-cleanup-29613090-wdfz7\" (UID: \"9d0d1fe1-9772-43d0-878d-6b205a036748\") " pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" Apr 21 15:30:00.467146 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.467058 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" Apr 21 15:30:00.596832 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.596801 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613090-wdfz7"] Apr 21 15:30:00.601498 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:30:00.601457 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0d1fe1_9772_43d0_878d_6b205a036748.slice/crio-9ce7b29dd5d15a961cb0b2d05faf6e3f6291604c7bbeb01cac65947ce569b507 WatchSource:0}: Error finding container 9ce7b29dd5d15a961cb0b2d05faf6e3f6291604c7bbeb01cac65947ce569b507: Status 404 returned error can't find the container with id 9ce7b29dd5d15a961cb0b2d05faf6e3f6291604c7bbeb01cac65947ce569b507 Apr 21 15:30:00.603292 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:00.603272 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:30:01.279073 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:01.279037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" event={"ID":"9d0d1fe1-9772-43d0-878d-6b205a036748","Type":"ContainerStarted","Data":"603cd7368921ea663f6d143bea305015355e682e1c023cf357e790247ed6a4cf"} Apr 21 15:30:01.279515 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:01.279079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" event={"ID":"9d0d1fe1-9772-43d0-878d-6b205a036748","Type":"ContainerStarted","Data":"9ce7b29dd5d15a961cb0b2d05faf6e3f6291604c7bbeb01cac65947ce569b507"} Apr 21 15:30:01.294300 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:01.294197 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" podStartSLOduration=1.294183624 podStartE2EDuration="1.294183624s" podCreationTimestamp="2026-04-21 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:30:01.293566385 +0000 UTC m=+2047.428543994" watchObservedRunningTime="2026-04-21 15:30:01.294183624 +0000 UTC m=+2047.429161233" Apr 21 15:30:14.908119 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:14.908025 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:30:21.362547 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:21.362508 2576 generic.go:358] "Generic (PLEG): container finished" podID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerID="603cd7368921ea663f6d143bea305015355e682e1c023cf357e790247ed6a4cf" exitCode=6 Apr 21 15:30:21.362942 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:21.362590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" event={"ID":"9d0d1fe1-9772-43d0-878d-6b205a036748","Type":"ContainerDied","Data":"603cd7368921ea663f6d143bea305015355e682e1c023cf357e790247ed6a4cf"} Apr 21 15:30:21.363009 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:21.362994 2576 scope.go:117] "RemoveContainer" containerID="603cd7368921ea663f6d143bea305015355e682e1c023cf357e790247ed6a4cf" Apr 21 15:30:22.367681 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:22.367647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" 
event={"ID":"9d0d1fe1-9772-43d0-878d-6b205a036748","Type":"ContainerStarted","Data":"b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a"} Apr 21 15:30:24.369502 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:24.369460 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:30:32.096161 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:32.096121 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:30:41.087200 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:41.087150 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:30:42.449398 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:42.449304 2576 generic.go:358] "Generic (PLEG): container finished" podID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerID="b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a" exitCode=6 Apr 21 15:30:42.451228 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:42.451198 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" event={"ID":"9d0d1fe1-9772-43d0-878d-6b205a036748","Type":"ContainerDied","Data":"b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a"} Apr 21 15:30:42.451388 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:42.451261 2576 scope.go:117] "RemoveContainer" containerID="603cd7368921ea663f6d143bea305015355e682e1c023cf357e790247ed6a4cf" Apr 21 15:30:42.451672 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:42.451649 2576 scope.go:117] "RemoveContainer" containerID="b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a" Apr 21 15:30:42.452004 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:30:42.451984 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29613090-wdfz7_opendatahub(9d0d1fe1-9772-43d0-878d-6b205a036748)\"" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" Apr 21 15:30:53.446358 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:53.446322 2576 scope.go:117] "RemoveContainer" containerID="b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a" Apr 21 15:30:54.506177 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:54.506135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" event={"ID":"9d0d1fe1-9772-43d0-878d-6b205a036748","Type":"ContainerStarted","Data":"a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f"} Apr 21 15:30:55.537685 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:55.537641 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613090-wdfz7"] Apr 21 15:30:55.538077 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:55.537865 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerName="cleanup" containerID="cri-o://a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f" gracePeriod=30 Apr 21 15:30:57.965775 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:30:57.965734 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:31:06.470325 
ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:06.470288 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:31:14.290315 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.290291 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" Apr 21 15:31:14.393781 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.393690 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxvzx\" (UniqueName: \"kubernetes.io/projected/9d0d1fe1-9772-43d0-878d-6b205a036748-kube-api-access-fxvzx\") pod \"9d0d1fe1-9772-43d0-878d-6b205a036748\" (UID: \"9d0d1fe1-9772-43d0-878d-6b205a036748\") " Apr 21 15:31:14.395906 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.395877 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0d1fe1-9772-43d0-878d-6b205a036748-kube-api-access-fxvzx" (OuterVolumeSpecName: "kube-api-access-fxvzx") pod "9d0d1fe1-9772-43d0-878d-6b205a036748" (UID: "9d0d1fe1-9772-43d0-878d-6b205a036748"). InnerVolumeSpecName "kube-api-access-fxvzx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:31:14.494556 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.494525 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fxvzx\" (UniqueName: \"kubernetes.io/projected/9d0d1fe1-9772-43d0-878d-6b205a036748-kube-api-access-fxvzx\") on node \"ip-10-0-130-121.ec2.internal\" DevicePath \"\"" Apr 21 15:31:14.578754 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.578715 2576 generic.go:358] "Generic (PLEG): container finished" podID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerID="a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f" exitCode=6 Apr 21 15:31:14.578754 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.578757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" event={"ID":"9d0d1fe1-9772-43d0-878d-6b205a036748","Type":"ContainerDied","Data":"a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f"} Apr 21 15:31:14.578972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.578779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" event={"ID":"9d0d1fe1-9772-43d0-878d-6b205a036748","Type":"ContainerDied","Data":"9ce7b29dd5d15a961cb0b2d05faf6e3f6291604c7bbeb01cac65947ce569b507"} Apr 21 15:31:14.578972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.578811 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613090-wdfz7" Apr 21 15:31:14.578972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.578806 2576 scope.go:117] "RemoveContainer" containerID="a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f" Apr 21 15:31:14.587468 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.587436 2576 scope.go:117] "RemoveContainer" containerID="b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a" Apr 21 15:31:14.595474 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.595450 2576 scope.go:117] "RemoveContainer" containerID="a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f" Apr 21 15:31:14.595735 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:31:14.595715 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f\": container with ID starting with a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f not found: ID does not exist" containerID="a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f" Apr 21 15:31:14.595786 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.595755 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f"} err="failed to get container status \"a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f\": rpc error: code = NotFound desc = could not find container \"a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f\": container with ID starting with a8125bcafb2fe2731176b9db4b4033717ff678b5b913d5523d945159dd1ec33f not found: ID does not exist" Apr 21 15:31:14.595786 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.595773 2576 scope.go:117] "RemoveContainer" containerID="b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a" Apr 21 15:31:14.596055 ip-10-0-130-121 kubenswrapper[2576]: E0421 15:31:14.596036 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a\": container with ID starting with b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a not found: ID does not exist" containerID="b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a" Apr 21 15:31:14.596097 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.596060 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a"} err="failed to get container status \"b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a\": rpc error: code = NotFound desc = could not find container \"b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a\": container with ID starting with b17e90308bd5e87f37586ea6d3746336b90879449f2541897dba6b93cd07450a not found: ID does not exist" Apr 21 15:31:14.599825 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.599799 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613090-wdfz7"] Apr 21 15:31:14.608997 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:14.608975 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613090-wdfz7"] Apr 21 15:31:15.268955 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:15.268917 2576 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:31:16.451296 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:16.451264 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" path="/var/lib/kubelet/pods/9d0d1fe1-9772-43d0-878d-6b205a036748/volumes" Apr 21 15:31:23.693102 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:23.693070 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:31:32.067221 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:32.067183 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:31:41.504385 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:41.504347 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:31:49.847222 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:31:49.847133 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:32:01.191657 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:32:01.191618 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:32:10.597971 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:32:10.597930 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:32:24.532478 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:32:24.532442 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:32:33.485607 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:32:33.485567 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:32:39.573929 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:32:39.573888 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:32:49.603167 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:32:49.603126 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:32:56.758676 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:32:56.758629 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:33:14.861889 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:33:14.861803 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:33:23.499689 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:33:23.499649 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:33:31.930844 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:33:31.930809 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:33:40.796812 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:33:40.796773 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:34:03.103972 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:34:03.103910 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"] Apr 21 15:34:16.004563 ip-10-0-130-121 
kubenswrapper[2576]: I0421 15:34:16.004517 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pl9jd"]
Apr 21 15:38:18.058541 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:18.058465 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-khrqf_eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4/manager/0.log"
Apr 21 15:38:18.313975 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:18.313898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-86d7574f74-4cmcv_10e56336-552f-427b-bf92-e8b259d808b4/manager/0.log"
Apr 21 15:38:18.439929 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:18.439897 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7hwfx_e6be0935-dd7b-426e-9d30-cb85b0ce12b3/manager/2.log"
Apr 21 15:38:18.668061 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:18.667968 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cfc874c8f-nrjzq_d2e64906-90da-499c-b2bf-68ff27e32f24/manager/0.log"
Apr 21 15:38:20.692498 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:20.692467 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-2xjcm_9a74a1ef-f649-4cdf-8c94-a9609ac8cc10/manager/0.log"
Apr 21 15:38:20.800156 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:20.800121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-pl9jd_2152ab38-2fb4-4ff0-b163-a898eb6a3258/limitador/0.log"
Apr 21 15:38:21.364082 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:21.364051 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-g5mdh_507869c0-c040-4125-b81d-d57c30d60623/discovery/0.log"
Apr 21 15:38:21.588734 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:21.588701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5bcc894b57-mjff6_ac03d2a6-4496-4c70-a2e0-73f30da8e8af/kube-auth-proxy/0.log"
Apr 21 15:38:21.698143 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:21.698059 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-489p7_bb55fb0d-f2b6-48fd-8ecc-616afdccee2f/istio-proxy/0.log"
Apr 21 15:38:22.167077 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:22.167040 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz_ff75fd08-1777-4ba9-8743-939ef551b3cd/storage-initializer/0.log"
Apr 21 15:38:22.174553 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:22.174532 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-t4lvz_ff75fd08-1777-4ba9-8743-939ef551b3cd/main/0.log"
Apr 21 15:38:22.756868 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:22.756839 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-zs92g_6ef91b59-0ad5-4b29-9123-dfda088056c4/storage-initializer/0.log"
Apr 21 15:38:22.765710 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:22.765689 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-zs92g_6ef91b59-0ad5-4b29-9123-dfda088056c4/main/0.log"
Apr 21 15:38:29.749063 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:29.749032 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-d7bn9_04bb98f2-3b25-4aa5-aa5b-4484506ce286/global-pull-secret-syncer/0.log"
Apr 21 15:38:29.883974 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:29.883943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jpxzc_0c7e2c16-e24b-4449-ae17-3e6e83f0e900/konnectivity-agent/0.log"
Apr 21 15:38:29.991356 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:29.991325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-121.ec2.internal_e294e9cc44aca557b7c9191559850248/haproxy/0.log"
Apr 21 15:38:34.868914 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:34.868828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-2xjcm_9a74a1ef-f649-4cdf-8c94-a9609ac8cc10/manager/0.log"
Apr 21 15:38:34.894904 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:34.894875 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-pl9jd_2152ab38-2fb4-4ff0-b163-a898eb6a3258/limitador/0.log"
Apr 21 15:38:36.423568 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.423542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a10a2180-d386-42f7-81e3-1815d058b44b/alertmanager/0.log"
Apr 21 15:38:36.445601 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.445576 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a10a2180-d386-42f7-81e3-1815d058b44b/config-reloader/0.log"
Apr 21 15:38:36.467833 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.467805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a10a2180-d386-42f7-81e3-1815d058b44b/kube-rbac-proxy-web/0.log"
Apr 21 15:38:36.489001 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.488979 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a10a2180-d386-42f7-81e3-1815d058b44b/kube-rbac-proxy/0.log"
Apr 21 15:38:36.516505 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.516480 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a10a2180-d386-42f7-81e3-1815d058b44b/kube-rbac-proxy-metric/0.log"
Apr 21 15:38:36.539604 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.539575 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a10a2180-d386-42f7-81e3-1815d058b44b/prom-label-proxy/0.log"
Apr 21 15:38:36.564213 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.564158 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a10a2180-d386-42f7-81e3-1815d058b44b/init-config-reloader/0.log"
Apr 21 15:38:36.635117 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.635083 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8vc5b_def25c30-d426-4e73-b651-27e69d1ef2aa/kube-state-metrics/0.log"
Apr 21 15:38:36.658994 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.658955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8vc5b_def25c30-d426-4e73-b651-27e69d1ef2aa/kube-rbac-proxy-main/0.log"
Apr 21 15:38:36.685284 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.685150 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8vc5b_def25c30-d426-4e73-b651-27e69d1ef2aa/kube-rbac-proxy-self/0.log"
Apr 21 15:38:36.939053 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.938976 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zpq9w_b7e187a6-d29a-449a-b0b0-7531acc7f526/node-exporter/0.log"
Apr 21 15:38:36.959330 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.959304 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zpq9w_b7e187a6-d29a-449a-b0b0-7531acc7f526/kube-rbac-proxy/0.log"
Apr 21 15:38:36.980523 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:36.980490 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zpq9w_b7e187a6-d29a-449a-b0b0-7531acc7f526/init-textfile/0.log"
Apr 21 15:38:37.259938 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.259906 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-wncmm_6b213c96-1aa7-4093-bb3d-e20a524c5c46/prometheus-operator/0.log"
Apr 21 15:38:37.282686 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.282651 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-wncmm_6b213c96-1aa7-4093-bb3d-e20a524c5c46/kube-rbac-proxy/0.log"
Apr 21 15:38:37.320311 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.320227 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-47zrq_c9361973-6868-4e2e-bf1e-b99353972328/prometheus-operator-admission-webhook/0.log"
Apr 21 15:38:37.346311 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.346281 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-594696bdbb-bnj47_18f09002-ebf7-4bc8-872a-a3e22b4843f5/telemeter-client/0.log"
Apr 21 15:38:37.369002 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.368966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-594696bdbb-bnj47_18f09002-ebf7-4bc8-872a-a3e22b4843f5/reload/0.log"
Apr 21 15:38:37.389742 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.389697 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-594696bdbb-bnj47_18f09002-ebf7-4bc8-872a-a3e22b4843f5/kube-rbac-proxy/0.log"
Apr 21 15:38:37.420876 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.420853 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76449cc8c-qs844_6c13ea85-b27b-4d7b-95f7-e1b478b57c96/thanos-query/0.log"
Apr 21 15:38:37.443046 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.443023 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76449cc8c-qs844_6c13ea85-b27b-4d7b-95f7-e1b478b57c96/kube-rbac-proxy-web/0.log"
Apr 21 15:38:37.468923 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.468896 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76449cc8c-qs844_6c13ea85-b27b-4d7b-95f7-e1b478b57c96/kube-rbac-proxy/0.log"
Apr 21 15:38:37.496514 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.496484 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76449cc8c-qs844_6c13ea85-b27b-4d7b-95f7-e1b478b57c96/prom-label-proxy/0.log"
Apr 21 15:38:37.524110 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.524025 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76449cc8c-qs844_6c13ea85-b27b-4d7b-95f7-e1b478b57c96/kube-rbac-proxy-rules/0.log"
Apr 21 15:38:37.548799 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:37.548751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76449cc8c-qs844_6c13ea85-b27b-4d7b-95f7-e1b478b57c96/kube-rbac-proxy-metrics/0.log"
Apr 21 15:38:38.355328 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.355294 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"]
Apr 21 15:38:38.355750 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.355737 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerName="cleanup"
Apr 21 15:38:38.355800 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.355752 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerName="cleanup"
Apr 21 15:38:38.355800 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.355761 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerName="cleanup"
Apr 21 15:38:38.355800 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.355767 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerName="cleanup"
Apr 21 15:38:38.355892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.355840 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerName="cleanup"
Apr 21 15:38:38.355892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.355850 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerName="cleanup"
Apr 21 15:38:38.355892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.355857 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d0d1fe1-9772-43d0-878d-6b205a036748" containerName="cleanup"
Apr 21 15:38:38.359095 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.359069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.361486 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.361459 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b6hvv\"/\"default-dockercfg-kmhbt\""
Apr 21 15:38:38.361596 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.361459 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b6hvv\"/\"openshift-service-ca.crt\""
Apr 21 15:38:38.361869 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.361855 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b6hvv\"/\"kube-root-ca.crt\""
Apr 21 15:38:38.373005 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.372978 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"]
Apr 21 15:38:38.453919 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.453881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rshpc\" (UniqueName: \"kubernetes.io/projected/bd0560de-d324-41be-a9ef-4f6467f37afd-kube-api-access-rshpc\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.454372 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.453939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-lib-modules\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.454372 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.453967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-podres\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.454372 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.454031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-proc\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.454372 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.454092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-sys\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.554627 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rshpc\" (UniqueName: \"kubernetes.io/projected/bd0560de-d324-41be-a9ef-4f6467f37afd-kube-api-access-rshpc\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.554627 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554645 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-lib-modules\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.554892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-podres\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.554892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-proc\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.554892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-sys\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.554892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-lib-modules\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.554892 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-proc\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.555080 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-sys\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.555080 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.554897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bd0560de-d324-41be-a9ef-4f6467f37afd-podres\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.563965 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.563940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rshpc\" (UniqueName: \"kubernetes.io/projected/bd0560de-d324-41be-a9ef-4f6467f37afd-kube-api-access-rshpc\") pod \"perf-node-gather-daemonset-rsrr5\" (UID: \"bd0560de-d324-41be-a9ef-4f6467f37afd\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.668471 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.668378 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:38.804128 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.804089 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"]
Apr 21 15:38:38.807091 ip-10-0-130-121 kubenswrapper[2576]: W0421 15:38:38.807060 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbd0560de_d324_41be_a9ef_4f6467f37afd.slice/crio-38b28b06e67f94cc8b4447ba8ccaef173abac144d8e1330a8058e8e3c35c9c4a WatchSource:0}: Error finding container 38b28b06e67f94cc8b4447ba8ccaef173abac144d8e1330a8058e8e3c35c9c4a: Status 404 returned error can't find the container with id 38b28b06e67f94cc8b4447ba8ccaef173abac144d8e1330a8058e8e3c35c9c4a
Apr 21 15:38:38.808906 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:38.808887 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:38:39.296843 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:39.296807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5" event={"ID":"bd0560de-d324-41be-a9ef-4f6467f37afd","Type":"ContainerStarted","Data":"5862b478e04f48e1907ee50c775b5d77cd46cfca86b2af1a2deec0eba1eed2e1"}
Apr 21 15:38:39.296843 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:39.296846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5" event={"ID":"bd0560de-d324-41be-a9ef-4f6467f37afd","Type":"ContainerStarted","Data":"38b28b06e67f94cc8b4447ba8ccaef173abac144d8e1330a8058e8e3c35c9c4a"}
Apr 21 15:38:39.297051 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:39.296872 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:39.315353 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:39.315287 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5" podStartSLOduration=1.315268422 podStartE2EDuration="1.315268422s" podCreationTimestamp="2026-04-21 15:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:38:39.314528717 +0000 UTC m=+2565.449506328" watchObservedRunningTime="2026-04-21 15:38:39.315268422 +0000 UTC m=+2565.450246033"
Apr 21 15:38:39.666152 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:39.666077 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fbc6bbc86-8x8m9_2adf5643-48d2-4d0b-baa0-940ed6abc933/console/0.log"
Apr 21 15:38:39.701722 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:39.701678 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-x9ttl_5510583f-77e8-4c46-ab4c-93c316c34fff/download-server/0.log"
Apr 21 15:38:40.989220 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:40.989192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vbmd9_1ada8e2a-356e-4899-913a-b055b92852e4/dns/0.log"
Apr 21 15:38:41.011860 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:41.011831 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vbmd9_1ada8e2a-356e-4899-913a-b055b92852e4/kube-rbac-proxy/0.log"
Apr 21 15:38:41.065008 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:41.064976 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kpdvd_dadbd785-1d07-45b6-868c-c95e20421c54/dns-node-resolver/0.log"
Apr 21 15:38:41.616062 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:41.616037 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7h2f9_297ac21d-4aa7-488f-8f40-48d7b969036b/node-ca/0.log"
Apr 21 15:38:42.599993 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:42.599964 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-g5mdh_507869c0-c040-4125-b81d-d57c30d60623/discovery/0.log"
Apr 21 15:38:42.646418 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:42.646393 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5bcc894b57-mjff6_ac03d2a6-4496-4c70-a2e0-73f30da8e8af/kube-auth-proxy/0.log"
Apr 21 15:38:42.715490 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:42.715457 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-489p7_bb55fb0d-f2b6-48fd-8ecc-616afdccee2f/istio-proxy/0.log"
Apr 21 15:38:43.348989 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:43.348950 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s7jz7_2ee3aa35-2266-4471-8170-7e506d7cd358/serve-healthcheck-canary/0.log"
Apr 21 15:38:44.335837 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:44.335796 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vnnr7_08e42ac9-1105-4882-9063-9f6e3e6d42a6/kube-rbac-proxy/0.log"
Apr 21 15:38:44.428985 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:44.428955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vnnr7_08e42ac9-1105-4882-9063-9f6e3e6d42a6/exporter/0.log"
Apr 21 15:38:44.483886 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:44.483854 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vnnr7_08e42ac9-1105-4882-9063-9f6e3e6d42a6/extractor/0.log"
Apr 21 15:38:45.312188 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:45.312160 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-rsrr5"
Apr 21 15:38:46.876423 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:46.876390 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-khrqf_eb6ef7aa-1de2-4b0f-abf1-895c0c7688a4/manager/0.log"
Apr 21 15:38:46.992222 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:46.992178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-86d7574f74-4cmcv_10e56336-552f-427b-bf92-e8b259d808b4/manager/0.log"
Apr 21 15:38:47.045946 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:47.045919 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7hwfx_e6be0935-dd7b-426e-9d30-cb85b0ce12b3/manager/1.log"
Apr 21 15:38:47.060229 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:47.060145 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7hwfx_e6be0935-dd7b-426e-9d30-cb85b0ce12b3/manager/2.log"
Apr 21 15:38:47.276603 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:47.276567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cfc874c8f-nrjzq_d2e64906-90da-499c-b2bf-68ff27e32f24/manager/0.log"
Apr 21 15:38:55.038832 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:55.038789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-t6zgc_9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c/kube-storage-version-migrator-operator/1.log"
Apr 21 15:38:55.039989 ip-10-0-130-121 kubenswrapper[2576]: I0421 15:38:55.039969 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-t6zgc_9f8ea7ab-3b7d-4a56-ab37-533e373f1a5c/kube-storage-version-migrator-operator/0.log"